Oct 02 09:32:22 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 02 09:32:22 crc restorecon[4673]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 09:32:22 crc restorecon[4673]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 09:32:22 crc restorecon[4673]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 09:32:22 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 
09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc 
restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 09:32:23 crc restorecon[4673]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 09:32:23 crc restorecon[4673]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 02 09:32:24 crc kubenswrapper[4722]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 09:32:24 crc kubenswrapper[4722]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 02 09:32:24 crc kubenswrapper[4722]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 09:32:24 crc kubenswrapper[4722]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 02 09:32:24 crc kubenswrapper[4722]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 02 09:32:24 crc kubenswrapper[4722]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.473538 4722 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477449 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477475 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477480 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477484 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477489 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477493 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477498 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477503 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477514 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477518 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477523 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477526 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477530 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477534 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477538 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477543 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
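The six "Flag ... has been deprecated" records above all point to the same migration path: move these command-line flags into the KubeletConfiguration file named by the kubelet's --config flag (on RHCOS machines this is typically /etc/kubernetes/kubelet.conf, though the file itself is not shown in this log). A minimal, purely illustrative sketch of the equivalent kubelet.config.k8s.io/v1beta1 fields follows; every value is a placeholder chosen for the example, not a setting recovered from this node:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint (placeholder CRI-O socket path)
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # replaces --volume-plugin-dir
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # replaces --register-with-taints
    registerWithTaints:
      - key: node-role.kubernetes.io/master
        effect: NoSchedule
    # replaces --system-reserved
    systemReserved:
      cpu: 500m
      memory: 1Gi
    # --minimum-container-ttl-duration is superseded by eviction settings
    evictionHard:
      memory.available: 100Mi

--pod-infra-container-image is the one flag with no config-file counterpart: as the log itself notes, the sandbox (pause) image information now comes from the CRI runtime, so on this host it would be configured in CRI-O rather than in the kubelet.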
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477547 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477551 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477554 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477558 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477561 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477565 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477568 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477571 4722 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477575 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477578 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477582 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477585 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477588 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477592 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477597 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477602 4722 feature_gate.go:330] unrecognized feature gate: Example Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477605 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477609 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477613 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477617 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477621 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477625 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477628 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477632 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477635 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477639 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477642 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477649 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477652 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477656 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477659 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477663 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477666 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477669 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477673 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477676 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477680 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477683 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477686 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477690 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477693 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration 
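The "unrecognized feature gate" warnings evidently cover gates defined at the OpenShift level rather than in the kubelet's own feature-gate registry, and the full set is re-logged on every configuration pass, so repeats are expected noise. A sketch that tallies them from the same hypothetical kubelet-journal.log capture:

    import re
    from collections import Counter

    # Count how often each unknown gate name is warned about; identical
    # counts across gates indicate whole-set repeats, not new problems.
    GATE = re.compile(r"unrecognized feature gate: (\w+)")

    counts = Counter()
    with open("kubelet-journal.log") as fh:  # hypothetical capture
        for line in fh:
            counts.update(GATE.findall(line))

    for gate, n in counts.most_common(5):
        print(f"{gate}: {n}")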
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477696 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477700 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477703 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477706 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477709 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477714 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477717 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477722 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477726 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477730 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477734 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477738 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477741 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.477745 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.478680 4722 flags.go:64] FLAG: --address="0.0.0.0" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.478696 4722 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479511 4722 flags.go:64] FLAG: --anonymous-auth="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479531 4722 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479542 4722 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479549 4722 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479561 4722 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479580 4722 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479590 4722 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479608 4722 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479616 4722 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479623 4722 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 02 
09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479630 4722 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479637 4722 flags.go:64] FLAG: --cgroup-root="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479642 4722 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479648 4722 flags.go:64] FLAG: --client-ca-file="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479653 4722 flags.go:64] FLAG: --cloud-config="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479659 4722 flags.go:64] FLAG: --cloud-provider="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479670 4722 flags.go:64] FLAG: --cluster-dns="[]" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479685 4722 flags.go:64] FLAG: --cluster-domain="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479691 4722 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479699 4722 flags.go:64] FLAG: --config-dir="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479706 4722 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479714 4722 flags.go:64] FLAG: --container-log-max-files="5" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479725 4722 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479732 4722 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479743 4722 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479749 4722 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479755 4722 flags.go:64] FLAG: --contention-profiling="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479761 4722 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479767 4722 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479773 4722 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479779 4722 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479790 4722 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479796 4722 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479809 4722 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479815 4722 flags.go:64] FLAG: --enable-load-reader="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479822 4722 flags.go:64] FLAG: --enable-server="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479831 4722 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479841 4722 flags.go:64] FLAG: --event-burst="100" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479847 4722 flags.go:64] FLAG: --event-qps="50" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479853 4722 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 
09:32:24.479859 4722 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479870 4722 flags.go:64] FLAG: --eviction-hard="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479878 4722 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479886 4722 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479892 4722 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479898 4722 flags.go:64] FLAG: --eviction-soft="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479904 4722 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479909 4722 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479915 4722 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479950 4722 flags.go:64] FLAG: --experimental-mounter-path="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479960 4722 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479966 4722 flags.go:64] FLAG: --fail-swap-on="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479972 4722 flags.go:64] FLAG: --feature-gates="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479979 4722 flags.go:64] FLAG: --file-check-frequency="20s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479986 4722 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479993 4722 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.479999 4722 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480006 4722 flags.go:64] FLAG: --healthz-port="10248" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480015 4722 flags.go:64] FLAG: --help="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480020 4722 flags.go:64] FLAG: --hostname-override="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480025 4722 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480031 4722 flags.go:64] FLAG: --http-check-frequency="20s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480036 4722 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480041 4722 flags.go:64] FLAG: --image-credential-provider-config="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480046 4722 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480051 4722 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480056 4722 flags.go:64] FLAG: --image-service-endpoint="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480065 4722 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480070 4722 flags.go:64] FLAG: --kube-api-burst="100" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480075 4722 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 02 09:32:24 crc kubenswrapper[4722]: 
I1002 09:32:24.480081 4722 flags.go:64] FLAG: --kube-api-qps="50" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480086 4722 flags.go:64] FLAG: --kube-reserved="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480094 4722 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480099 4722 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480104 4722 flags.go:64] FLAG: --kubelet-cgroups="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480108 4722 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480117 4722 flags.go:64] FLAG: --lock-file="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480122 4722 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480127 4722 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480133 4722 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480144 4722 flags.go:64] FLAG: --log-json-split-stream="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480150 4722 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480155 4722 flags.go:64] FLAG: --log-text-split-stream="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480161 4722 flags.go:64] FLAG: --logging-format="text" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480171 4722 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480178 4722 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480186 4722 flags.go:64] FLAG: --manifest-url="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480194 4722 flags.go:64] FLAG: --manifest-url-header="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480217 4722 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480223 4722 flags.go:64] FLAG: --max-open-files="1000000" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480261 4722 flags.go:64] FLAG: --max-pods="110" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480265 4722 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480270 4722 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480274 4722 flags.go:64] FLAG: --memory-manager-policy="None" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480279 4722 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480509 4722 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480518 4722 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480523 4722 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480544 4722 flags.go:64] FLAG: --node-status-max-images="50" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480549 4722 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480554 4722 flags.go:64] FLAG: --oom-score-adj="-999" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480559 4722 flags.go:64] FLAG: --pod-cidr="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480564 4722 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480574 4722 flags.go:64] FLAG: --pod-manifest-path="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480579 4722 flags.go:64] FLAG: --pod-max-pids="-1" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480584 4722 flags.go:64] FLAG: --pods-per-core="0" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480588 4722 flags.go:64] FLAG: --port="10250" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480593 4722 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480598 4722 flags.go:64] FLAG: --provider-id="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480603 4722 flags.go:64] FLAG: --qos-reserved="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480607 4722 flags.go:64] FLAG: --read-only-port="10255" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480612 4722 flags.go:64] FLAG: --register-node="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480616 4722 flags.go:64] FLAG: --register-schedulable="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480621 4722 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480631 4722 flags.go:64] FLAG: --registry-burst="10" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480636 4722 flags.go:64] FLAG: --registry-qps="5" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480640 4722 flags.go:64] FLAG: --reserved-cpus="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480644 4722 flags.go:64] FLAG: --reserved-memory="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480667 4722 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480672 4722 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480677 4722 flags.go:64] FLAG: --rotate-certificates="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480681 4722 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480685 4722 flags.go:64] FLAG: --runonce="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480690 4722 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480694 4722 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480699 4722 flags.go:64] FLAG: --seccomp-default="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480703 4722 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480707 4722 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480712 4722 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480718 4722 
flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480723 4722 flags.go:64] FLAG: --storage-driver-password="root" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480727 4722 flags.go:64] FLAG: --storage-driver-secure="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480732 4722 flags.go:64] FLAG: --storage-driver-table="stats" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480736 4722 flags.go:64] FLAG: --storage-driver-user="root" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480741 4722 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480745 4722 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480750 4722 flags.go:64] FLAG: --system-cgroups="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480754 4722 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480763 4722 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480767 4722 flags.go:64] FLAG: --tls-cert-file="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480771 4722 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480780 4722 flags.go:64] FLAG: --tls-min-version="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480784 4722 flags.go:64] FLAG: --tls-private-key-file="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480789 4722 flags.go:64] FLAG: --topology-manager-policy="none" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480802 4722 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480807 4722 flags.go:64] FLAG: --topology-manager-scope="container" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480812 4722 flags.go:64] FLAG: --v="2" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480840 4722 flags.go:64] FLAG: --version="false" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480847 4722 flags.go:64] FLAG: --vmodule="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480857 4722 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.480868 4722 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481022 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481030 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481034 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481041 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
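The FLAG: lines above are the kubelet's dump of its effective command-line flags, one flags.go:64 entry per flag, with values quoted. A sketch to recover them as a dictionary from the same hypothetical capture:

    import re

    # Parse entries such as:  flags.go:64] FLAG: --node-ip="192.168.126.11"
    FLAG = re.compile(r'FLAG: (--[\w-]+)="([^"]*)"')

    flags = {}
    with open("kubelet-journal.log") as fh:  # hypothetical capture
        for line in fh:
            for name, value in FLAG.findall(line):
                flags[name] = value

    print(flags["--config"])           # /etc/kubernetes/kubelet.conf
    print(flags["--node-ip"])          # 192.168.126.11
    print(flags["--system-reserved"])  # cpu=200m,ephemeral-storage=350Mi,memory=350Mi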
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481046 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481051 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481054 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481059 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481063 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481067 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481073 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481078 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481082 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481086 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481091 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481096 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481100 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481103 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481107 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481120 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481124 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481128 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481131 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481135 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481138 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481142 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481146 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481149 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481153 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481157 
4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481160 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481165 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481168 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481172 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481176 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481180 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481183 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481187 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481190 4722 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481194 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481199 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481203 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481207 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481210 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481214 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481217 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481220 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481224 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481227 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481231 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481234 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481238 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481241 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481246 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481250 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481254 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481258 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481262 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481266 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481270 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481273 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481278 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481283 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481288 4722 feature_gate.go:330] unrecognized feature gate: Example Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481292 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481296 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481300 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481303 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481307 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481324 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.481328 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.481336 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.490486 4722 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.490540 4722 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490634 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490653 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490661 4722 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490673 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490679 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490685 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490690 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490695 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490700 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490705 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490710 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490714 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490718 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490724 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490730 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490737 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
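Each configuration pass ends with a feature gates: {map[...]} summary (first at 09:32:24.481336 above) listing what the kubelet actually applied. A sketch that converts that Go-style map dump into a Python dict, using the summary text itself as input:

    import re

    # The summary as logged; Go prints a map as "map[key:value ...]".
    LINE = ("feature gates: {map[CloudDualStackNodeIPs:true "
            "DisableKubeletCloudCredentialProviders:true "
            "DynamicResourceAllocation:false EventedPLEG:false KMSv1:true "
            "MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false "
            "RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false "
            "TranslateStreamCloseWebsocketRequests:false "
            "UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false "
            "ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}")

    gates = {k: v == "true" for k, v in re.findall(r"(\w+):(true|false)", LINE)}
    print(sorted(g for g, on in gates.items() if on))
    # ['CloudDualStackNodeIPs', 'DisableKubeletCloudCredentialProviders',
    #  'KMSv1', 'ValidatingAdmissionPolicy']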
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490744 4722 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490749 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490755 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490760 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490765 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490770 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490774 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490780 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490785 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490790 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490795 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490799 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490804 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490811 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490817 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490823 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490829 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490835 4722 feature_gate.go:330] unrecognized feature gate: Example Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490841 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490846 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490851 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490856 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490861 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490868 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490874 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490880 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490885 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490890 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490895 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490901 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490906 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490912 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490918 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490923 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490928 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490933 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490939 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
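Interleaved with the unknown-gate warnings, the feature_gate.go:351/353 entries note gates that are set explicitly even though they are already GA or deprecated. A sketch to list them from the hypothetical capture:

    import re

    # Extract "Setting GA feature gate X=true" / "Setting deprecated
    # feature gate X=true" notices; these gates still work but are slated
    # for removal, per the messages above.
    SET_GATE = re.compile(r"Setting (GA|deprecated) feature gate (\w+)=(true|false)")

    seen = {}
    with open("kubelet-journal.log") as fh:  # hypothetical capture
        for line in fh:
            for status, gate, value in SET_GATE.findall(line):
                seen[gate] = (status, value == "true")

    for gate, (status, on) in sorted(seen.items()):
        print(f"{gate}: {status}, enabled={on}")
    # Expected here: CloudDualStackNodeIPs (GA), KMSv1 (deprecated),
    # DisableKubeletCloudCredentialProviders (GA), ValidatingAdmissionPolicy (GA)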
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490946 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490951 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490956 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490961 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490965 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490970 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490975 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490980 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490984 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490988 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490993 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.490998 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491003 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491008 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491012 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491016 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491021 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491027 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.491035 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491206 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491216 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491220 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491225 4722 feature_gate.go:330] unrecognized 
feature gate: GCPLabelsTags Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491228 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491232 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491235 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491239 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491243 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491247 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491250 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491254 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491257 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491262 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491266 4722 feature_gate.go:330] unrecognized feature gate: Example Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491270 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491275 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491280 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491284 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491291 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491299 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491304 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491308 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491333 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491338 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491342 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491346 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491351 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491356 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491360 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491364 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491369 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491374 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491379 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491391 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491396 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491400 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491405 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491411 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491416 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491421 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491428 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491433 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491438 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491442 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491447 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491452 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491456 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491461 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491467 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491471 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491476 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491480 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491484 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491489 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491493 4722 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491497 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491502 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491508 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491513 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491518 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491523 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491528 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491532 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491539 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491543 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491548 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491553 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491558 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491562 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.491574 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.491582 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.493429 4722 server.go:940] "Client rotation is on, will bootstrap in background" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.498078 4722 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.498179 4722 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
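Just below, the kubelet reports client-certificate rotation timing: expiration 2026-02-24 05:52:08 UTC, a rotation deadline of 2025-11-29 21:05:30 UTC, and a wait of 1403h33m5.6s. The wait is simply the deadline minus the current log time; a quick check, with timestamps approximated from the entries that follow:

    from datetime import datetime, timezone

    # "now" is the log timestamp of the waiting entry; the deadline comes
    # from the certificate_manager entry. Both approximated to the values
    # printed in the journal below.
    now = datetime(2025, 10, 2, 9, 32, 24, 501401, tzinfo=timezone.utc)
    deadline = datetime(2025, 11, 29, 21, 5, 30, 101090, tzinfo=timezone.utc)

    hours, rem = divmod(int((deadline - now).total_seconds()), 3600)
    minutes, seconds = divmod(rem, 60)
    print(f"{hours}h{minutes}m{seconds}s")  # 1403h33m5s, matching the log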
Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.500160 4722 server.go:997] "Starting client certificate rotation" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.500190 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.501327 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-29 21:05:30.101089697 +0000 UTC Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.501404 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1403h33m5.599688553s for next certificate rotation Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.541446 4722 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.543514 4722 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.560866 4722 log.go:25] "Validated CRI v1 runtime API" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.595676 4722 log.go:25] "Validated CRI v1 image API" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.597580 4722 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.603902 4722 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-02-09-28-45-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.603938 4722 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.622030 4722 manager.go:217] Machine: {Timestamp:2025-10-02 09:32:24.618661792 +0000 UTC m=+0.655414792 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8b109f7e-b539-4abd-be49-fd9e033b3339 BootID:680c7bd3-d5b0-4b5c-92c1-2fb03a628b44 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 
DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:8a:95:6c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:8a:95:6c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f8:17:13 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:84:a9:4f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:48:d2:81 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:df:32:47 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:12:5d:e4:28:2d:bc Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:6e:64:9e:a5:3b:9a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 
BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.622267 4722 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.622474 4722 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.622870 4722 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.623067 4722 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.623104 4722 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.623346 4722 topology_manager.go:138] "Creating topology manager with none policy" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.623356 4722 container_manager_linux.go:303] "Creating device plugin manager" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.624304 4722 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.624349 4722 server.go:66] "Creating device plugin registration server" version="v1beta1" 
socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.625261 4722 state_mem.go:36] "Initialized new in-memory state store" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.625466 4722 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.633251 4722 kubelet.go:418] "Attempting to sync node with API server" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.633274 4722 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.633299 4722 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.633334 4722 kubelet.go:324] "Adding apiserver pod source" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.633356 4722 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.638641 4722 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.639425 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.639470 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.639529 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.639550 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.639945 4722 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.642562 4722 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.643987 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644016 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644026 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644041 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644055 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644065 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644074 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644087 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644099 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644108 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644122 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.644130 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.645455 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.645844 4722 server.go:1280] "Started kubelet" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.646862 4722 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.646859 4722 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 02 09:32:24 crc systemd[1]: Started Kubernetes Kubelet. 
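The plugins.go:603 lines enumerate the in-tree volume plugins (empty-dir, host-path, secret, configmap, projected, and so on) plus CSI; each registers under a "kubernetes.io/<name>" key in a single lookup table, and only once that registry and the listener on port 10250 are in place does "Started kubelet" appear and systemd mark the unit started. A compact sketch of such a name-keyed registry (illustrative types, not the kubelet's actual plugin manager):

    package main

    import "fmt"

    // VolumePlugin and registry show the "kubernetes.io/<name>" keying;
    // both are assumptions made for this sketch.
    type VolumePlugin interface {
            Name() string
    }

    type hostPath struct{}

    func (hostPath) Name() string { return "kubernetes.io/host-path" }

    type emptyDir struct{}

    func (emptyDir) Name() string { return "kubernetes.io/empty-dir" }

    type registry map[string]VolumePlugin

    func (r registry) load(p VolumePlugin) {
            r[p.Name()] = p
            fmt.Printf("Loaded volume plugin %q\n", p.Name()) // cf. plugins.go:603
    }

    func main() {
            r := registry{}
            for _, p := range []VolumePlugin{hostPath{}, emptyDir{}} {
                    r.load(p)
            }
    }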
Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.647660 4722 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.647989 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.648418 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.648523 4722 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.648547 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 18:26:23.537785521 +0000 UTC Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.648593 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1184h53m58.889195606s for next certificate rotation Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.654200 4722 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.654241 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.654683 4722 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.654886 4722 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.655375 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.131:6443: connect: connection refused" interval="200ms" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.657084 4722 factory.go:55] Registering systemd factory Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.657118 4722 factory.go:221] Registration of the systemd container factory successfully Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.657491 4722 factory.go:153] Registering CRI-O factory Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.657582 4722 factory.go:221] Registration of the crio container factory successfully Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.657458 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.657726 4722 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.657766 4722 factory.go:103] Registering Raw factory Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.657797 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list 
*v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.657944 4722 manager.go:1196] Started watching for new ooms in manager Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.659095 4722 server.go:460] "Adding debug handlers to kubelet server" Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.657850 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.131:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aa2bec34f02d5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 09:32:24.645821141 +0000 UTC m=+0.682574121,LastTimestamp:2025-10-02 09:32:24.645821141 +0000 UTC m=+0.682574121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.667718 4722 manager.go:319] Starting recovery of all containers Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.669770 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.669905 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.669986 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.670053 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.670117 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.670192 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.670261 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.670346 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.673523 4722 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.673636 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.673710 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.673777 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.673836 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.673906 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.673975 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674039 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674102 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674160 4722 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674250 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674307 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674406 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674484 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674565 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674645 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674720 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674808 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.674898 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675022 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675118 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675208 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675296 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675408 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675485 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675566 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675650 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675737 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675856 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.675946 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676028 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676116 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676212 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676330 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676428 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676505 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676580 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676653 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676715 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676795 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676859 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676921 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.676987 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677045 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677104 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677170 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677233 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677294 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677379 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677454 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677520 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677576 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677635 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677693 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677749 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677809 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677865 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677922 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.677983 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678040 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678102 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678164 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678221 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678286 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678409 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678485 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678552 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678617 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678679 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678743 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678845 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.678943 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679019 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679092 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679168 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679247 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679343 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679434 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679506 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679622 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679689 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679749 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679840 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679900 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.679962 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680063 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680128 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680189 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680247 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680303 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680401 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680466 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680527 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680586 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680647 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680705 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680765 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680829 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680895 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.680957 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681022 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681084 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681148 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681218 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681277 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681361 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681426 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681492 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681553 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681610 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681666 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681728 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681788 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681848 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681911 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.681974 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682032 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682090 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682152 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682218 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682281 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682375 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682440 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682500 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682566 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682628 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682688 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682767 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682851 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.682935 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683025 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683104 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683187 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683256 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683424 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683513 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683584 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683657 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683724 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683809 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683887 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.683965 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.685178 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.685498 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.685618 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.685853 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.685943 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686028 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686113 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686197 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686282 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686413 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686496 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686559 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686645 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686734 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686819 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686894 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.686977 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.687058 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.687158 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.687393 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.687521 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.687754 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.687986 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688104 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688194 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688279 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688416 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688492 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688562 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688633 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688708 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.684419 4722 manager.go:324] Recovery completed Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688773 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688863 4722 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688894 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688911 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688927 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688944 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688960 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688975 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.688991 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689007 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689021 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689035 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689049 4722 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689064 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689078 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689091 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689104 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689119 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689135 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689149 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689164 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689180 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689217 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689233 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689247 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689263 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689277 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689291 4722 reconstruct.go:97] "Volume reconstruction finished" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.689301 4722 reconciler.go:26] "Reconciler: start to sync state" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.697576 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.701712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.701751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.701761 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.703014 4722 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.703040 4722 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.703068 4722 state_mem.go:36] "Initialized new in-memory state store" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.706248 4722 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.708939 4722 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.708990 4722 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.709024 4722 kubelet.go:2335] "Starting kubelet main sync loop" Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.709239 4722 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 02 09:32:24 crc kubenswrapper[4722]: W1002 09:32:24.711944 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.712024 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.723652 4722 policy_none.go:49] "None policy: Start" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.724570 4722 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.724600 4722 state_mem.go:35] "Initializing new in-memory state store" Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.754365 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.766989 4722 manager.go:334] "Starting Device Plugin manager" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.767037 4722 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.767050 4722 server.go:79] "Starting device plugin registration server" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.767564 4722 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.767584 4722 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.767704 4722 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.767878 4722 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.767894 4722 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.776354 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.810261 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 02 09:32:24 crc kubenswrapper[4722]: 
I1002 09:32:24.810424 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.811802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.811853 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.811866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.812091 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.812281 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.812364 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.813244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.813298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.813327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.813497 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.813745 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.813822 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.814910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.814942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.814953 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.816413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.816467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.816470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.816482 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.816489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.816502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.816730 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.816822 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.816861 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.819157 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.819210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.819230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.819806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.819857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.819869 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.820091 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.820234 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.820284 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.821063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.821073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.821084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.821093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.821094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.821109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.821298 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.821356 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.822080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.822110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.822122 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.856404 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.131:6443: connect: connection refused" interval="400ms" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.868398 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.869906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.869944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.869955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.869981 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 09:32:24 crc kubenswrapper[4722]: E1002 09:32:24.870563 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.131:6443: connect: connection refused" node="crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891529 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891588 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891620 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891671 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891689 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891708 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891728 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891789 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891887 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891952 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.891975 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.892030 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.892050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.892081 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.993378 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.993462 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.993687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.993620 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.993554 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.993768 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.993933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.993989 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994130 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994205 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 09:32:24 crc 
kubenswrapper[4722]: I1002 09:32:24.994278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994340 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 
09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.995111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.995177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:24 crc kubenswrapper[4722]: I1002 09:32:24.994582 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.070977 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.073112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.073144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.073168 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.073194 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 09:32:25 crc kubenswrapper[4722]: E1002 09:32:25.073699 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.131:6443: connect: connection refused" node="crc" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.150249 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.156028 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.174283 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.192336 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:25 crc kubenswrapper[4722]: W1002 09:32:25.195924 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2ccdca7cddacef267608bb4bd070e03b3cd92e43288cff50e4e4f59dd15400c8 WatchSource:0}: Error finding container 2ccdca7cddacef267608bb4bd070e03b3cd92e43288cff50e4e4f59dd15400c8: Status 404 returned error can't find the container with id 2ccdca7cddacef267608bb4bd070e03b3cd92e43288cff50e4e4f59dd15400c8 Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.197278 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:25 crc kubenswrapper[4722]: W1002 09:32:25.200763 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-1eae077c5137316440c6b86c82e7f9b95f03d9eb091ccb0797135c96eb71a92f WatchSource:0}: Error finding container 1eae077c5137316440c6b86c82e7f9b95f03d9eb091ccb0797135c96eb71a92f: Status 404 returned error can't find the container with id 1eae077c5137316440c6b86c82e7f9b95f03d9eb091ccb0797135c96eb71a92f Oct 02 09:32:25 crc kubenswrapper[4722]: W1002 09:32:25.208346 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-dacbf1e8958b9fbc934b4f4760227fe71b087a4d210bcfff7f9e25c439863a63 WatchSource:0}: Error finding container dacbf1e8958b9fbc934b4f4760227fe71b087a4d210bcfff7f9e25c439863a63: Status 404 returned error can't find the container with id dacbf1e8958b9fbc934b4f4760227fe71b087a4d210bcfff7f9e25c439863a63 Oct 02 09:32:25 crc kubenswrapper[4722]: W1002 09:32:25.209951 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-6899420eecf584cb317a3d7e9804c46a5025559a73cdf3cf6bcda21ff09ad1b8 WatchSource:0}: Error finding container 6899420eecf584cb317a3d7e9804c46a5025559a73cdf3cf6bcda21ff09ad1b8: Status 404 returned error can't find the container with id 6899420eecf584cb317a3d7e9804c46a5025559a73cdf3cf6bcda21ff09ad1b8 Oct 02 09:32:25 crc kubenswrapper[4722]: W1002 09:32:25.211114 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-422183cffee3d17d656cc9c4e12ad79d362d7be085c928717feff5a295d4082c WatchSource:0}: Error finding container 422183cffee3d17d656cc9c4e12ad79d362d7be085c928717feff5a295d4082c: Status 404 returned error can't find the container with id 422183cffee3d17d656cc9c4e12ad79d362d7be085c928717feff5a295d4082c Oct 02 09:32:25 crc kubenswrapper[4722]: E1002 09:32:25.258068 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.131:6443: connect: connection refused" interval="800ms" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.474160 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.475445 4722 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.475477 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.475521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.475548 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 09:32:25 crc kubenswrapper[4722]: E1002 09:32:25.475964 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.131:6443: connect: connection refused" node="crc" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.649585 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.717530 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1eae077c5137316440c6b86c82e7f9b95f03d9eb091ccb0797135c96eb71a92f"} Oct 02 09:32:25 crc kubenswrapper[4722]: W1002 09:32:25.719993 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.720087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"422183cffee3d17d656cc9c4e12ad79d362d7be085c928717feff5a295d4082c"} Oct 02 09:32:25 crc kubenswrapper[4722]: E1002 09:32:25.720083 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.721634 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6899420eecf584cb317a3d7e9804c46a5025559a73cdf3cf6bcda21ff09ad1b8"} Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.722993 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dacbf1e8958b9fbc934b4f4760227fe71b087a4d210bcfff7f9e25c439863a63"} Oct 02 09:32:25 crc kubenswrapper[4722]: I1002 09:32:25.724107 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2ccdca7cddacef267608bb4bd070e03b3cd92e43288cff50e4e4f59dd15400c8"} Oct 02 09:32:25 crc kubenswrapper[4722]: W1002 09:32:25.793295 4722 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:25 crc kubenswrapper[4722]: E1002 09:32:25.793447 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:25 crc kubenswrapper[4722]: W1002 09:32:25.984487 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:25 crc kubenswrapper[4722]: E1002 09:32:25.984603 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:26 crc kubenswrapper[4722]: E1002 09:32:26.059744 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.131:6443: connect: connection refused" interval="1.6s" Oct 02 09:32:26 crc kubenswrapper[4722]: W1002 09:32:26.178592 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:26 crc kubenswrapper[4722]: E1002 09:32:26.178814 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.276595 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.279346 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.279394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.279409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.279440 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 09:32:26 crc kubenswrapper[4722]: E1002 09:32:26.279937 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.131:6443: connect: connection refused" node="crc" Oct 02 09:32:26 
crc kubenswrapper[4722]: I1002 09:32:26.649218 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.729483 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="16af577303472240f6be1eb0175cae33d92df4e7965dcc21d449e5426cde3cc9" exitCode=0 Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.729588 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"16af577303472240f6be1eb0175cae33d92df4e7965dcc21d449e5426cde3cc9"} Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.729705 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.731510 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.731567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.731588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.732762 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35" exitCode=0 Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.732850 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.732894 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35"} Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.733917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.733961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.733998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.735427 4722 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446" exitCode=0 Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.735510 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446"} Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.735566 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:26 crc kubenswrapper[4722]: 
I1002 09:32:26.737070 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.737103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.737117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.738498 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182" exitCode=0 Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.738558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182"} Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.738675 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.739991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.740026 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.740045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.741857 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.742388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5"} Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.742419 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7"} Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.742434 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5"} Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.742446 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5"} Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.742989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.743023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:32:26 crc kubenswrapper[4722]: I1002 09:32:26.743034 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:27 crc kubenswrapper[4722]: W1002 09:32:27.619686 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:27 crc kubenswrapper[4722]: E1002 09:32:27.619799 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.649460 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:27 crc kubenswrapper[4722]: E1002 09:32:27.661402 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.131:6443: connect: connection refused" interval="3.2s" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.746515 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6e52fd01cec3feb2e23116be0ee802fff889e44f5b835b09ce15cce2d607d2c3" exitCode=0 Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.746626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6e52fd01cec3feb2e23116be0ee802fff889e44f5b835b09ce15cce2d607d2c3"} Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.746692 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.748334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.748387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.748399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.752031 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.752029 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"beaa5e380cbd961e9e06c7aad0f3ff2f42c43eab5e89b6cf779003c490baf018"} Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.752864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.752897 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.752908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.753986 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a"} Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.754017 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215"} Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.755464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa"} Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.755496 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.756120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.756149 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.756160 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:27 crc kubenswrapper[4722]: W1002 09:32:27.798853 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:27 crc kubenswrapper[4722]: E1002 09:32:27.798976 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.880664 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.882242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.882292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.882304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:27 crc kubenswrapper[4722]: I1002 09:32:27.882358 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 09:32:27 crc kubenswrapper[4722]: E1002 09:32:27.882951 4722 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.131:6443: connect: connection refused" node="crc" Oct 02 09:32:28 crc kubenswrapper[4722]: W1002 09:32:28.010400 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:28 crc kubenswrapper[4722]: E1002 09:32:28.010515 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:28 crc kubenswrapper[4722]: W1002 09:32:28.322652 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:28 crc kubenswrapper[4722]: E1002 09:32:28.322755 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.131:6443: connect: connection refused" logger="UnhandledError" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.649740 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.761684 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8cb6a1648af34bd7f97bbe60686c2d57c6b8e9ea5b15aba7ca25460565febbd0" exitCode=0 Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.761744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8cb6a1648af34bd7f97bbe60686c2d57c6b8e9ea5b15aba7ca25460565febbd0"} Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.762007 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.763492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.763530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.763576 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.765146 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d"} Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 
09:32:28.765253 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.769276 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.769392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.769547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.774290 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.774713 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06"} Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.774828 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd"} Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.774914 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed"} Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.775265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.775297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:28 crc kubenswrapper[4722]: I1002 09:32:28.775328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.290275 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.290444 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.291413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.291436 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.291445 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.651369 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.131:6443: connect: connection refused Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.783004 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"df2d4d186bea7154dd49391b5a32f6dae60214d08a4f0efc3ba770b950d602f6"} Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.783066 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"30003d3713199085ff1a79d8ea9e66d2d6f787689e072f639d611c6ccf41f8ac"} Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.783079 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"87be6d5b772e600788aafb9be35368a8ca62bbb1d2553d4ac9bf8ab684c107c2"} Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.783090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3a9de4346b8278b90978ca17be9cb2d7141d076585875086061cd940b70b9842"} Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.787342 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"36712afef53cdf53640ea04eba26a04b3a118e89df534fdafdf184d12f171d6a"} Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.787403 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.787543 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.787543 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.788390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.788446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.788464 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.789079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.789116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:29 crc kubenswrapper[4722]: I1002 09:32:29.789126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.155367 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.155587 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.157074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 
09:32:30.157133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.157146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.173419 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.785980 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.796759 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.796824 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.797320 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ae5e7b6ece8ad6107bea24b258aa605bdedb34601326885a23c7d26e9c85e0be"} Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.797412 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.797567 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.797636 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.797984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.798030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.798042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.800172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.800211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.800237 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.803182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.803206 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.803232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.803247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 
09:32:30.803249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:30 crc kubenswrapper[4722]: I1002 09:32:30.803266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.083790 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.085563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.085620 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.085630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.085660 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.542161 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.641523 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.799761 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.799784 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.799809 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.799811 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.800976 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.800996 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.801007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.801015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.801027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.801019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.800990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 09:32:31.801300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:31 crc kubenswrapper[4722]: I1002 
09:32:31.801392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:32 crc kubenswrapper[4722]: I1002 09:32:32.290506 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 09:32:32 crc kubenswrapper[4722]: I1002 09:32:32.290691 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 09:32:32 crc kubenswrapper[4722]: I1002 09:32:32.801858 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:32 crc kubenswrapper[4722]: I1002 09:32:32.803910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:32 crc kubenswrapper[4722]: I1002 09:32:32.803954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:32 crc kubenswrapper[4722]: I1002 09:32:32.803969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:33 crc kubenswrapper[4722]: I1002 09:32:33.052400 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:33 crc kubenswrapper[4722]: I1002 09:32:33.052705 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:33 crc kubenswrapper[4722]: I1002 09:32:33.054396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:33 crc kubenswrapper[4722]: I1002 09:32:33.054467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:33 crc kubenswrapper[4722]: I1002 09:32:33.054565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:33 crc kubenswrapper[4722]: I1002 09:32:33.749633 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 02 09:32:33 crc kubenswrapper[4722]: I1002 09:32:33.804558 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:33 crc kubenswrapper[4722]: I1002 09:32:33.805727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:33 crc kubenswrapper[4722]: I1002 09:32:33.805818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:33 crc kubenswrapper[4722]: I1002 09:32:33.805857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:34 crc kubenswrapper[4722]: I1002 09:32:34.306180 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:34 crc kubenswrapper[4722]: I1002 09:32:34.306629 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:34 crc kubenswrapper[4722]: I1002 09:32:34.308481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:34 crc kubenswrapper[4722]: I1002 09:32:34.308551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:34 crc kubenswrapper[4722]: I1002 09:32:34.308572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:34 crc kubenswrapper[4722]: E1002 09:32:34.776551 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 09:32:34 crc kubenswrapper[4722]: I1002 09:32:34.856656 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:34 crc kubenswrapper[4722]: I1002 09:32:34.856853 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:34 crc kubenswrapper[4722]: I1002 09:32:34.858104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:34 crc kubenswrapper[4722]: I1002 09:32:34.858134 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:34 crc kubenswrapper[4722]: I1002 09:32:34.858144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:40 crc kubenswrapper[4722]: I1002 09:32:40.650460 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 02 09:32:40 crc kubenswrapper[4722]: I1002 09:32:40.827829 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 09:32:40 crc kubenswrapper[4722]: I1002 09:32:40.829993 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="36712afef53cdf53640ea04eba26a04b3a118e89df534fdafdf184d12f171d6a" exitCode=255 Oct 02 09:32:40 crc kubenswrapper[4722]: I1002 09:32:40.830037 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"36712afef53cdf53640ea04eba26a04b3a118e89df534fdafdf184d12f171d6a"} Oct 02 09:32:40 crc kubenswrapper[4722]: I1002 09:32:40.830210 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:40 crc kubenswrapper[4722]: I1002 09:32:40.831137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:40 crc kubenswrapper[4722]: I1002 09:32:40.831172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:40 crc kubenswrapper[4722]: I1002 09:32:40.831183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 09:32:40 crc kubenswrapper[4722]: I1002 09:32:40.831666 4722 scope.go:117] "RemoveContainer" containerID="36712afef53cdf53640ea04eba26a04b3a118e89df534fdafdf184d12f171d6a" Oct 02 09:32:40 crc kubenswrapper[4722]: E1002 09:32:40.863512 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.044864 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.044932 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.049797 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.049927 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.546759 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.546927 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.548366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.548429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.548442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.669200 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.669480 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.670756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:41 crc 
kubenswrapper[4722]: I1002 09:32:41.670797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.670811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.705661 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.835021 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.836773 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02"} Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.836865 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.836906 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.837854 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.837866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.837899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.837913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.837918 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:41 crc kubenswrapper[4722]: I1002 09:32:41.837938 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:42 crc kubenswrapper[4722]: I1002 09:32:42.291083 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 09:32:42 crc kubenswrapper[4722]: I1002 09:32:42.291203 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 09:32:43 crc kubenswrapper[4722]: I1002 09:32:43.052935 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:43 crc kubenswrapper[4722]: I1002 09:32:43.053139 4722 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:43 crc kubenswrapper[4722]: I1002 09:32:43.054536 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:43 crc kubenswrapper[4722]: I1002 09:32:43.054570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:43 crc kubenswrapper[4722]: I1002 09:32:43.054579 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:44 crc kubenswrapper[4722]: I1002 09:32:44.313420 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:44 crc kubenswrapper[4722]: I1002 09:32:44.313734 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:44 crc kubenswrapper[4722]: I1002 09:32:44.315592 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:44 crc kubenswrapper[4722]: I1002 09:32:44.315704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:44 crc kubenswrapper[4722]: I1002 09:32:44.315733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:44 crc kubenswrapper[4722]: I1002 09:32:44.317489 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:44 crc kubenswrapper[4722]: E1002 09:32:44.776663 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 09:32:44 crc kubenswrapper[4722]: I1002 09:32:44.846185 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:44 crc kubenswrapper[4722]: I1002 09:32:44.847590 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:44 crc kubenswrapper[4722]: I1002 09:32:44.847643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:44 crc kubenswrapper[4722]: I1002 09:32:44.847653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.037035 4722 trace.go:236] Trace[120250299]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 09:32:33.555) (total time: 12481ms): Oct 02 09:32:46 crc kubenswrapper[4722]: Trace[120250299]: ---"Objects listed" error: 12481ms (09:32:46.036) Oct 02 09:32:46 crc kubenswrapper[4722]: Trace[120250299]: [12.481296003s] [12.481296003s] END Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.037071 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.038861 4722 trace.go:236] Trace[1759331344]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 09:32:33.651) (total time: 12387ms): Oct 02 09:32:46 crc kubenswrapper[4722]: Trace[1759331344]: ---"Objects listed" error: 12387ms (09:32:46.038) Oct 02 09:32:46 crc kubenswrapper[4722]: Trace[1759331344]: [12.38730934s] 
[12.38730934s] END Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.038926 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.040079 4722 trace.go:236] Trace[1100268429]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 09:32:33.518) (total time: 12521ms): Oct 02 09:32:46 crc kubenswrapper[4722]: Trace[1100268429]: ---"Objects listed" error: 12521ms (09:32:46.040) Oct 02 09:32:46 crc kubenswrapper[4722]: Trace[1100268429]: [12.521644251s] [12.521644251s] END Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.040103 4722 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.040105 4722 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.040779 4722 trace.go:236] Trace[353190063]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 09:32:32.774) (total time: 13266ms): Oct 02 09:32:46 crc kubenswrapper[4722]: Trace[353190063]: ---"Objects listed" error: 13266ms (09:32:46.040) Oct 02 09:32:46 crc kubenswrapper[4722]: Trace[353190063]: [13.266425546s] [13.266425546s] END Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.040812 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.041658 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.646843 4722 apiserver.go:52] "Watching apiserver" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.650734 4722 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.651099 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.651635 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.651798 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.651671 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.651887 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.651949 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.652100 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.652150 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.652198 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.652354 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.655161 4722 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.658731 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.658998 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.659108 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.659143 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.659309 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.659351 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.659726 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.660034 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.660275 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.691064 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.705683 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.722832 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.736403 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.744767 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.744825 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.744845 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.744866 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.744883 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.744904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.744924 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 09:32:46 crc 
kubenswrapper[4722]: I1002 09:32:46.744940 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.744958 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.744973 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.744987 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745009 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745149 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745172 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745194 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745217 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745239 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745266 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745443 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745471 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745498 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745529 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745553 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745581 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745608 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745632 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745659 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745682 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745730 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745755 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745781 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745806 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745830 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745860 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745913 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745937 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745961 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.745986 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746012 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746035 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746059 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746085 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746113 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746139 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746163 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746187 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746212 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746217 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746239 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746265 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746331 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746361 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746387 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746412 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746441 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746514 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746546 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746570 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746595 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746640 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746664 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746687 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746710 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod 
\"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746732 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746781 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746808 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746855 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746880 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746927 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746921 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" 
(OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746948 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746971 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.746995 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747019 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747040 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747063 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747081 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747088 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747136 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747156 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747167 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747191 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747242 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747260 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747293 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747369 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747551 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747559 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747580 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747612 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747633 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747656 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747679 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747700 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747743 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747792 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747842 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747886 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747908 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747932 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747954 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747976 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747993 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.747999 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748027 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748027 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748052 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748076 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748114 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748122 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748148 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748171 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748194 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748215 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748241 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748266 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748316 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748354 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748376 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748398 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748420 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748444 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748469 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748492 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748513 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748538 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748561 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748584 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748606 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748630 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748652 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748677 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748703 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748726 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748748 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748771 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748789 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748807 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748825 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748842 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748860 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748884 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748906 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748950 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748968 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748986 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749004 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749022 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749039 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749057 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749074 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749093 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749110 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749127 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749144 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749162 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749179 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749214 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749231 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749250 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749304 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749357 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749380 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749400 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749421 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749440 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749460 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749480 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749501 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749541 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749578 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749596 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 
09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749612 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749629 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749665 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749699 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749717 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749754 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749770 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749787 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749802 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749875 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749894 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749936 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749954 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750018 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750117 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750163 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750174 4722 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750185 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750195 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750209 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750223 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750237 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750250 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750262 4722 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750272 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750282 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750293 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750306 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750339 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.757057 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.759768 4722 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.765490 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.772875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.774146 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.778531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748240 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748387 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748613 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.748731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749043 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749047 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749175 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749330 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749468 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749677 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749765 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.749799 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750108 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.750883 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.751000 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.751206 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.751405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). 
InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.751432 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.751519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.751674 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.751702 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.751800 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.751895 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.752040 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.752138 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.752146 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.752404 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.752472 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.752750 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.752915 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.753228 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.753235 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.753371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.753701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.753720 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.753914 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.753957 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.753982 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.754165 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.754207 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.754275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.754414 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.754405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.754570 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.754700 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.754735 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:32:47.254698017 +0000 UTC m=+23.291451037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.780042 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.780720 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.780990 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.781439 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.781914 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.754801 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.754883 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.755138 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.755218 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.755504 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.755602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.755855 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.756144 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.756354 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.756586 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.756876 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.757084 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.757507 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.757930 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.757998 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.782602 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:47.282335585 +0000 UTC m=+23.319088665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.782618 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.758847 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.759424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.760164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.760788 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.761125 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.761530 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.761872 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.763060 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.763118 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.763271 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.763438 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.763521 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.783029 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.764020 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.764121 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.764361 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.764396 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.764503 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.764617 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.764836 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.764957 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.765054 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.765281 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.765312 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.765644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.765671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.766147 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.766356 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.766671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.766716 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.766805 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.766813 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.766993 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.767224 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.768298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.768726 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.769030 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.769207 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.769449 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.770277 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.770602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.770879 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.771065 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.771577 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.771845 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.772033 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.772230 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.772238 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.772419 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.772546 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.772667 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.773300 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.773481 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.773626 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.776120 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.776565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.777088 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.777416 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.777852 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.778121 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.778413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.778928 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.779094 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.779332 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.779379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.779390 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.783054 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.783886 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.784104 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.784120 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.784679 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.784942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.785153 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:47.285127498 +0000 UTC m=+23.321880478 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.785194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.785227 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.785267 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.785303 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:47.285296572 +0000 UTC m=+23.322049542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.785601 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.785765 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.786067 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.786092 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.787048 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.794742 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.797845 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.798298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.799046 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.800680 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.800706 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.800719 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.800764 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:47.300750594 +0000 UTC m=+23.337503574 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.801156 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.801890 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.803700 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.804126 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.805113 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.806577 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.815685 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.824827 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.824958 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.825414 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.826539 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.826687 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.826768 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.828050 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.830281 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.830385 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.830604 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.831178 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.831628 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.832936 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.834484 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.834910 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.839224 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.839215 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.839666 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.841723 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.841897 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.842022 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.842497 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.842663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.842927 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.842834 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.843009 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.843074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.843102 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.843169 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.843456 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.845224 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.851642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.851747 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.851862 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.851884 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.851904 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.851918 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.851930 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.851941 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.851954 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.851965 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852033 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852053 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852066 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852078 4722 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852092 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852105 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852118 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852129 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852171 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852183 4722 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852196 4722 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852206 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852217 4722 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852229 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852240 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852251 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852263 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852274 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852345 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852366 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852377 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852388 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852399 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852411 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852455 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852465 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852609 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852852 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852871 4722 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852883 4722 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852893 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852904 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852923 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852934 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852946 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852957 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.852971 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853008 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853061 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853075 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853098 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853110 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853156 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853168 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853207 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853240 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853253 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853263 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853274 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853341 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853359 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853372 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853383 4722 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853395 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853434 4722 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853446 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853477 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853488 4722 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853499 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853512 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853524 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853535 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853545 4722 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853841 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 02 
09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853857 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853885 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853897 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853909 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853919 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853930 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853940 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853952 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853963 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853974 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853984 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.853995 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854006 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc 
kubenswrapper[4722]: I1002 09:32:46.854017 4722 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854027 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854038 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854064 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854097 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854110 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854122 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854148 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854159 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854169 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854180 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854190 4722 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854218 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc 
kubenswrapper[4722]: I1002 09:32:46.854231 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854276 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854290 4722 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854303 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854215 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854335 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854347 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854358 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854370 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854381 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854417 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854427 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854441 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 
09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854492 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854523 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854535 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854545 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854556 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854567 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854578 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854588 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854599 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854629 4722 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854639 4722 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854649 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854659 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc 
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854670 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854681 4722 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854697 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854724 4722 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854750 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854887 4722 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854903 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854934 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854948 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854962 4722 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.854976 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855008 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855021 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855034 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855049 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855061 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855072 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855087 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855100 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855110 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855122 4722 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855133 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855072 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855155 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855168 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855179 4722 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855189 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855199 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855210 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855220 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855245 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855257 4722 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855385 4722 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855397 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855411 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855422 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855432 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855443 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855454 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855466 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855477 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855596 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855612 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855623 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855633 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855644 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855697 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855711 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855721 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855745 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855757 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855800 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855812 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855822 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.855833 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.857307 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.857472 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02" exitCode=255 Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.857502 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02"} Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.857597 4722 scope.go:117] "RemoveContainer" containerID="36712afef53cdf53640ea04eba26a04b3a118e89df534fdafdf184d12f171d6a" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.858055 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.858199 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.862546 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.866770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.871518 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.873184 4722 scope.go:117] "RemoveContainer" containerID="ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02" Oct 02 09:32:46 crc kubenswrapper[4722]: E1002 09:32:46.873457 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.877454 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.883179 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.896248 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.896248 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.907502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.916177 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.926366 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.956786 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.956821 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.956837 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.967963 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.975158 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 02 09:32:46 crc kubenswrapper[4722]: I1002 09:32:46.980779 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 02 09:32:46 crc kubenswrapper[4722]: W1002 09:32:46.992484 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-aaefc5e5a1b2fa64b60cadfd70b3ac673c45c3fa488e37d588d809790a7e545e WatchSource:0}: Error finding container aaefc5e5a1b2fa64b60cadfd70b3ac673c45c3fa488e37d588d809790a7e545e: Status 404 returned error can't find the container with id aaefc5e5a1b2fa64b60cadfd70b3ac673c45c3fa488e37d588d809790a7e545e
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.260107 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.260386 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:32:48.260345188 +0000 UTC m=+24.297098168 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.361034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.361081 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.361100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.361118 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361213 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361264 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361290 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361305 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361351 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:48.361309221 +0000 UTC m=+24.398062201 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361224 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361373 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:48.361364263 +0000 UTC m=+24.398117333 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361400 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:48.361389543 +0000 UTC m=+24.398142523 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361468 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361529 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361543 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.361659 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:48.361616479 +0000 UTC m=+24.398369459 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.606046 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6hm66"]
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.606513 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6hm66"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.608966 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.609230 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.609273 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.623155 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.646727 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.663799 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f7c5a4b-6be1-42da-abd5-039f84d7d6a8-hosts-file\") pod \"node-resolver-6hm66\" (UID: \"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\") " pod="openshift-dns/node-resolver-6hm66"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.663854 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k768\" (UniqueName: \"kubernetes.io/projected/3f7c5a4b-6be1-42da-abd5-039f84d7d6a8-kube-api-access-7k768\") pod \"node-resolver-6hm66\" (UID: \"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\") " pod="openshift-dns/node-resolver-6hm66"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.664419 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.676163 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.693218 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36712afef53cdf53640ea04eba26a04b3a118e89df534fdafdf184d12f171d6a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:40Z\\\",\\\"message\\\":\\\"W1002 09:32:29.420364 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 
09:32:29.420797 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759397549 cert, and key in /tmp/serving-cert-2903392625/serving-signer.crt, /tmp/serving-cert-2903392625/serving-signer.key\\\\nI1002 09:32:29.918835 1 observer_polling.go:159] Starting file observer\\\\nW1002 09:32:29.932900 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 09:32:29.933093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:29.933849 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2903392625/tls.crt::/tmp/serving-cert-2903392625/tls.key\\\\\\\"\\\\nF1002 09:32:40.211825 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 
09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.705781 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.716085 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.729707 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 09:32:47 crc 
kubenswrapper[4722]: I1002 09:32:47.765262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f7c5a4b-6be1-42da-abd5-039f84d7d6a8-hosts-file\") pod \"node-resolver-6hm66\" (UID: \"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\") " pod="openshift-dns/node-resolver-6hm66" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.765338 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k768\" (UniqueName: \"kubernetes.io/projected/3f7c5a4b-6be1-42da-abd5-039f84d7d6a8-kube-api-access-7k768\") pod \"node-resolver-6hm66\" (UID: \"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\") " pod="openshift-dns/node-resolver-6hm66" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.765461 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f7c5a4b-6be1-42da-abd5-039f84d7d6a8-hosts-file\") pod \"node-resolver-6hm66\" (UID: \"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\") " pod="openshift-dns/node-resolver-6hm66" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.793516 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k768\" (UniqueName: \"kubernetes.io/projected/3f7c5a4b-6be1-42da-abd5-039f84d7d6a8-kube-api-access-7k768\") pod \"node-resolver-6hm66\" (UID: \"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\") " pod="openshift-dns/node-resolver-6hm66" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.861881 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.864045 4722 scope.go:117] "RemoveContainer" containerID="ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02" Oct 02 09:32:47 crc kubenswrapper[4722]: E1002 09:32:47.864208 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.864580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"aaefc5e5a1b2fa64b60cadfd70b3ac673c45c3fa488e37d588d809790a7e545e"} Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.866215 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3"} Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.866246 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1fb9bf8924f890266a0e9c473a3c34c9a48fedbbcd49973e07dd0d8771d206f6"} Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.868211 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7"} Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.868280 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d"} Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.868293 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bcdbc055102df4ddffe86176d7325443099aa0742e622f919acbc745f70ab6bd"} Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.880654 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:47Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.896257 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:47Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.911960 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:47Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.923152 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6hm66" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.929184 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:47Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.942117 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:47Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:47 crc kubenswrapper[4722]: W1002 09:32:47.961072 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7c5a4b_6be1_42da_abd5_039f84d7d6a8.slice/crio-8905ac87a37292a46b7c842fe2bea8c1c2c4bf90ce2eaa6de8891ef61ae306be WatchSource:0}: Error finding container 8905ac87a37292a46b7c842fe2bea8c1c2c4bf90ce2eaa6de8891ef61ae306be: Status 404 returned error can't find the container with id 8905ac87a37292a46b7c842fe2bea8c1c2c4bf90ce2eaa6de8891ef61ae306be Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.961888 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:47Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:47 crc kubenswrapper[4722]: I1002 09:32:47.982005 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:47Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.006144 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.023486 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.041469 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.049413 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zwlns"] Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.049838 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: W1002 09:32:48.052813 4722 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.052850 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 09:32:48 crc kubenswrapper[4722]: W1002 09:32:48.052891 4722 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.052955 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 09:32:48 crc kubenswrapper[4722]: W1002 09:32:48.053040 4722 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list *v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.053062 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch 
*v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.053123 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.053138 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.061078 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067149 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-multus-cni-dir\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067201 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-os-release\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067229 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-multus-socket-dir-parent\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067254 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-run-netns\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-var-lib-kubelet\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067302 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-multus-conf-dir\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04c84986-75a1-4683-8764-8f7307d1126a-multus-daemon-config\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067378 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-run-multus-certs\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067434 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04c84986-75a1-4683-8764-8f7307d1126a-cni-binary-copy\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067496 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-var-lib-cni-bin\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067524 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghrmz\" (UniqueName: 
\"kubernetes.io/projected/04c84986-75a1-4683-8764-8f7307d1126a-kube-api-access-ghrmz\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067611 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-system-cni-dir\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067689 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-hostroot\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067733 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-etc-kubernetes\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067829 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-cnibin\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-run-k8s-cni-cncf-io\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.067922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-var-lib-cni-multus\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.074913 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.089585 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.120981 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.160596 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.168996 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-var-lib-kubelet\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169035 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-multus-conf-dir\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-os-release\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-multus-socket-dir-parent\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169118 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-run-netns\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169144 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04c84986-75a1-4683-8764-8f7307d1126a-multus-daemon-config\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169146 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-var-lib-kubelet\") pod 
\"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169208 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-run-multus-certs\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169164 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-run-multus-certs\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-multus-conf-dir\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04c84986-75a1-4683-8764-8f7307d1126a-cni-binary-copy\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169274 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-run-netns\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169282 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-var-lib-cni-bin\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169337 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-multus-socket-dir-parent\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghrmz\" (UniqueName: \"kubernetes.io/projected/04c84986-75a1-4683-8764-8f7307d1126a-kube-api-access-ghrmz\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169378 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-system-cni-dir\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169399 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-hostroot\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169418 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-etc-kubernetes\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-os-release\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-run-k8s-cni-cncf-io\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169460 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-system-cni-dir\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169473 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-var-lib-cni-multus\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-cnibin\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169501 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-etc-kubernetes\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-var-lib-cni-bin\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169540 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-var-lib-cni-multus\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 
09:32:48.169528 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-hostroot\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169576 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-multus-cni-dir\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-multus-cni-dir\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169582 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-host-run-k8s-cni-cncf-io\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.169624 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04c84986-75a1-4683-8764-8f7307d1126a-cnibin\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.170344 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04c84986-75a1-4683-8764-8f7307d1126a-cni-binary-copy\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.186217 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.219710 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.240195 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.260554 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.270442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.270733 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:32:50.270704584 +0000 UTC m=+26.307457564 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.273604 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.294875 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.329947 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.358703 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.371741 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:48 
crc kubenswrapper[4722]: I1002 09:32:48.371802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.371827 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.371855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.371989 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.372048 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.371998 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.372111 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.372159 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.372173 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.372128 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.372106 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:50.372081939 +0000 UTC m=+26.408834919 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.372335 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.372379 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:50.372355486 +0000 UTC m=+26.409108466 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.372403 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:50.372396027 +0000 UTC m=+26.409149007 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.372460 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:50.372424688 +0000 UTC m=+26.409177838 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.390645 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.403534 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.451150 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-q6h76"] Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.451834 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.452280 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jlvpl"] Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.453176 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.454538 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 02 09:32:48 crc kubenswrapper[4722]: W1002 09:32:48.454987 4722 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.455029 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.455097 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 02 09:32:48 crc kubenswrapper[4722]: W1002 09:32:48.455224 4722 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.455245 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to 
watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 09:32:48 crc kubenswrapper[4722]: W1002 09:32:48.455551 4722 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.455592 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 09:32:48 crc kubenswrapper[4722]: W1002 09:32:48.455787 4722 reflector.go:561] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.455811 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 09:32:48 crc kubenswrapper[4722]: W1002 09:32:48.455843 4722 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-config": failed to list *v1.ConfigMap: configmaps "ovnkube-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.455864 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 02 09:32:48 crc kubenswrapper[4722]: W1002 09:32:48.455899 4722 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": failed to list *v1.ConfigMap: configmaps "ovnkube-script-lib" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 02 09:32:48 crc kubenswrapper[4722]: W1002 09:32:48.455912 4722 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": failed to list *v1.Secret: secrets "ovn-node-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.455939 4722 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-node-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.455868 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.455915 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-script-lib\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.456090 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.456645 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.457112 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-cc2g9"] Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.458042 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.459603 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.461511 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.470995 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.472649 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-system-cni-dir\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.472704 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49vfk\" (UniqueName: \"kubernetes.io/projected/c3945e65-623a-40eb-82ae-8b17bcf8827b-kube-api-access-49vfk\") pod \"multus-additional-cni-plugins-cc2g9\" 
(UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.472737 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-ovn\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.472755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-bin\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.472865 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-config\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.472932 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-systemd\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.472990 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-cnibin\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3945e65-623a-40eb-82ae-8b17bcf8827b-cni-binary-copy\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473151 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f662826c-e772-4f56-a85f-83971aaef356-proxy-tls\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473186 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3945e65-623a-40eb-82ae-8b17bcf8827b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473213 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovn-node-metrics-cert\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473261 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-openvswitch\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473341 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-netd\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473375 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-env-overrides\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-script-lib\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473573 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473921 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-netns\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473963 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-var-lib-openvswitch\") pod \"ovnkube-node-jlvpl\" (UID: 
\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.473992 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54j47\" (UniqueName: \"kubernetes.io/projected/f662826c-e772-4f56-a85f-83971aaef356-kube-api-access-54j47\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-ovn-kubernetes\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474052 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-os-release\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-node-log\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474093 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-kubelet\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474129 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-systemd-units\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474162 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-etc-openvswitch\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pfvc\" (UniqueName: \"kubernetes.io/projected/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-kube-api-access-4pfvc\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474210 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-slash\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-log-socket\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474272 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f662826c-e772-4f56-a85f-83971aaef356-mcd-auth-proxy-config\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.474364 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f662826c-e772-4f56-a85f-83971aaef356-rootfs\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.490194 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.505358 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.519920 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.531889 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.544043 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.560820 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-systemd\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575241 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-cnibin\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575303 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/c3945e65-623a-40eb-82ae-8b17bcf8827b-cni-binary-copy\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575358 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-cnibin\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-systemd\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f662826c-e772-4f56-a85f-83971aaef356-proxy-tls\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575471 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3945e65-623a-40eb-82ae-8b17bcf8827b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575499 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovn-node-metrics-cert\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575551 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-openvswitch\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-openvswitch\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575667 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-netd\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575710 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-env-overrides\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575739 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-script-lib\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.575836 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-netd\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576030 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576085 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-netns\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576204 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-var-lib-openvswitch\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576243 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3945e65-623a-40eb-82ae-8b17bcf8827b-cni-binary-copy\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576155 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-netns\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576279 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-var-lib-openvswitch\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576342 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54j47\" (UniqueName: \"kubernetes.io/projected/f662826c-e772-4f56-a85f-83971aaef356-kube-api-access-54j47\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576371 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-ovn-kubernetes\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3945e65-623a-40eb-82ae-8b17bcf8827b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576563 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-ovn-kubernetes\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576755 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-os-release\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576792 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-node-log\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576829 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-kubelet\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576854 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-systemd-units\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576858 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576876 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-etc-openvswitch\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576910 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-etc-openvswitch\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576943 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pfvc\" (UniqueName: \"kubernetes.io/projected/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-kube-api-access-4pfvc\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-slash\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576988 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-log-socket\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.576990 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-os-release\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577007 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f662826c-e772-4f56-a85f-83971aaef356-mcd-auth-proxy-config\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577034 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-node-log\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/f662826c-e772-4f56-a85f-83971aaef356-rootfs\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577069 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-system-cni-dir\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577095 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-systemd-units\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49vfk\" (UniqueName: \"kubernetes.io/projected/c3945e65-623a-40eb-82ae-8b17bcf8827b-kube-api-access-49vfk\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577150 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-ovn\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577167 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-bin\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577231 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-ovn\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577260 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-log-socket\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577236 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-bin\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f662826c-e772-4f56-a85f-83971aaef356-rootfs\") pod 
\"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577370 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-slash\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577411 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3945e65-623a-40eb-82ae-8b17bcf8827b-system-cni-dir\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577070 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-kubelet\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577453 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-config\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.577842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f662826c-e772-4f56-a85f-83971aaef356-mcd-auth-proxy-config\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.579746 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f662826c-e772-4f56-a85f-83971aaef356-proxy-tls\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.581883 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.613167 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54j47\" (UniqueName: \"kubernetes.io/projected/f662826c-e772-4f56-a85f-83971aaef356-kube-api-access-54j47\") pod \"machine-config-daemon-q6h76\" (UID: \"f662826c-e772-4f56-a85f-83971aaef356\") " pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.638178 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.664963 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.687433 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.701283 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.710061 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.710139 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.710070 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.710266 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.710381 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:32:48 crc kubenswrapper[4722]: E1002 09:32:48.710469 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.714889 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.715441 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.716813 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.717466 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.718480 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.719207 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.719898 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.720861 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.721564 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.723198 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.723927 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.723982 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.725487 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.726723 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.727308 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.728555 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.729154 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.730553 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.731029 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.731967 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.733280 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.733953 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.735283 4722 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.735779 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.736802 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.737294 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.737928 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.739164 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.739681 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.742852 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.743347 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.744287 4722 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.744411 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.745548 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete 
status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729
d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.747982 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.749791 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.750310 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.752482 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.753480 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.754523 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.755298 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.756894 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.757476 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.758885 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.759851 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.761153 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.761781 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.762438 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.763127 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.764040 4722 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.764542 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.765478 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.765954 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.767059 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.767882 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.768998 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.769616 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.770106 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.790059 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.809640 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.822384 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.840348 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.853918 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.871048 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.872707 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerStarted","Data":"0bb3b602b661e119078b2307bb05ad05c7645e844b6c39a67e5e92b1e197d395"} Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.874736 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6hm66" event={"ID":"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8","Type":"ContainerStarted","Data":"9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b"} Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.874826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6hm66" 
event={"ID":"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8","Type":"ContainerStarted","Data":"8905ac87a37292a46b7c842fe2bea8c1c2c4bf90ce2eaa6de8891ef61ae306be"} Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.886370 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.904360 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.919511 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.946073 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.964740 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:48 crc kubenswrapper[4722]: I1002 09:32:48.989430 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:48Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.008295 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.029346 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.047893 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.064891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.080945 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.084481 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.090869 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04c84986-75a1-4683-8764-8f7307d1126a-multus-daemon-config\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.098259 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.115690 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: E1002 09:32:49.194078 4722 projected.go:288] Couldn't get configMap openshift-multus/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 02 09:32:49 crc kubenswrapper[4722]: E1002 09:32:49.194135 4722 projected.go:194] Error preparing data for projected volume kube-api-access-ghrmz for pod openshift-multus/multus-zwlns: failed to sync configmap cache: timed out waiting for the condition Oct 02 09:32:49 crc kubenswrapper[4722]: E1002 09:32:49.194246 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04c84986-75a1-4683-8764-8f7307d1126a-kube-api-access-ghrmz podName:04c84986-75a1-4683-8764-8f7307d1126a nodeName:}" failed. No retries permitted until 2025-10-02 09:32:49.694219554 +0000 UTC m=+25.730972534 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ghrmz" (UniqueName: "kubernetes.io/projected/04c84986-75a1-4683-8764-8f7307d1126a-kube-api-access-ghrmz") pod "multus-zwlns" (UID: "04c84986-75a1-4683-8764-8f7307d1126a") : failed to sync configmap cache: timed out waiting for the condition Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.287850 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.298120 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.307856 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.309329 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.329054 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.335591 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"openshift-service-ca.crt" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.343689 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.344259 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.346925 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.351421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49vfk\" (UniqueName: \"kubernetes.io/projected/c3945e65-623a-40eb-82ae-8b17bcf8827b-kube-api-access-49vfk\") pod \"multus-additional-cni-plugins-cc2g9\" (UID: \"c3945e65-623a-40eb-82ae-8b17bcf8827b\") " pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.356989 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-env-overrides\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.365961 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.377092 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.391007 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev
/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\
\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.411345 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.429305 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.443426 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.458106 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.471677 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.486763 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.494092 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.497494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-script-lib\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.500219 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.515933 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.529775 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.546092 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: E1002 09:32:49.576076 4722 secret.go:188] Couldn't get secret openshift-ovn-kubernetes/ovn-node-metrics-cert: failed to sync secret cache: timed out waiting for the condition Oct 02 09:32:49 crc kubenswrapper[4722]: E1002 09:32:49.576170 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovn-node-metrics-cert podName:a900403a-9c51-4ca5-862f-2fbbd30cbcc3 nodeName:}" 
failed. No retries permitted until 2025-10-02 09:32:50.076150759 +0000 UTC m=+26.112903729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovn-node-metrics-cert" (UniqueName: "kubernetes.io/secret/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovn-node-metrics-cert") pod "ovnkube-node-jlvpl" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3") : failed to sync secret cache: timed out waiting for the condition Oct 02 09:32:49 crc kubenswrapper[4722]: E1002 09:32:49.578292 4722 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-config: failed to sync configmap cache: timed out waiting for the condition Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.578376 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: E1002 09:32:49.578436 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-config podName:a900403a-9c51-4ca5-862f-2fbbd30cbcc3 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:50.078405158 +0000 UTC m=+26.115158218 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovnkube-config" (UniqueName: "kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-config") pod "ovnkube-node-jlvpl" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3") : failed to sync configmap cache: timed out waiting for the condition Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.618279 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.661891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.697053 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.739740 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.779225 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.789229 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.789240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghrmz\" (UniqueName: \"kubernetes.io/projected/04c84986-75a1-4683-8764-8f7307d1126a-kube-api-access-ghrmz\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.792799 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghrmz\" (UniqueName: \"kubernetes.io/projected/04c84986-75a1-4683-8764-8f7307d1126a-kube-api-access-ghrmz\") pod \"multus-zwlns\" (UID: \"04c84986-75a1-4683-8764-8f7307d1126a\") " pod="openshift-multus/multus-zwlns" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.829045 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.837739 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pfvc\" (UniqueName: \"kubernetes.io/projected/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-kube-api-access-4pfvc\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.860882 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.864006 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zwlns" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.879509 4722 generic.go:334] "Generic (PLEG): container finished" podID="c3945e65-623a-40eb-82ae-8b17bcf8827b" containerID="7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a" exitCode=0 Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.879596 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" event={"ID":"c3945e65-623a-40eb-82ae-8b17bcf8827b","Type":"ContainerDied","Data":"7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a"} Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.879629 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" event={"ID":"c3945e65-623a-40eb-82ae-8b17bcf8827b","Type":"ContainerStarted","Data":"a5c70566f5d770aaa195a499c4667480dca0f2148c70f9022effbc0063614d13"} Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.886552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0"} Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.889689 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerStarted","Data":"8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447"} Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.889870 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerStarted","Data":"cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150"} Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.896381 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.908875 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 02 09:32:49 crc kubenswrapper[4722]: W1002 09:32:49.925025 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04c84986_75a1_4683_8764_8f7307d1126a.slice/crio-53fdd050e46d0a9ba115e44fda05f2d8cb54d01e4d861bf228b16233d18cf9f4 WatchSource:0}: Error finding container 53fdd050e46d0a9ba115e44fda05f2d8cb54d01e4d861bf228b16233d18cf9f4: Status 404 returned error can't find the container with id 53fdd050e46d0a9ba115e44fda05f2d8cb54d01e4d861bf228b16233d18cf9f4 Oct 02 09:32:49 crc kubenswrapper[4722]: I1002 09:32:49.958282 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:49Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.004127 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.009666 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.062596 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.092294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-config\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.092397 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovn-node-metrics-cert\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.093387 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-config\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.097091 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovn-node-metrics-cert\") pod \"ovnkube-node-jlvpl\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.097618 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.139445 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.215492 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.255193 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.270148 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.271233 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:50 crc kubenswrapper[4722]: W1002 09:32:50.284058 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda900403a_9c51_4ca5_862f_2fbbd30cbcc3.slice/crio-852db608e5f10cf96b37d5670b47f7763fbe2e293ec97cd6faf3a5d2bc626dfd WatchSource:0}: Error finding container 852db608e5f10cf96b37d5670b47f7763fbe2e293ec97cd6faf3a5d2bc626dfd: Status 404 returned error can't find the container with id 852db608e5f10cf96b37d5670b47f7763fbe2e293ec97cd6faf3a5d2bc626dfd Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.295742 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.296006 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:32:54.295972844 +0000 UTC m=+30.332725824 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.302601 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.338573 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
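
The TearDown failure just above is not about the volume itself: kubevirt.io.hostpath-provisioner has simply not re-registered with this kubelet yet, since CSI node plugins register locally over the kubelet plugin-registration socket after their pod starts. Once registration happens, the retry scheduled for 09:32:54 can proceed. A hedged client-go sketch for checking what the node has registered, via its CSINode object; the kubeconfig path is a placeholder, and the node name "crc" matches the hostname in these records:

```go
// Sketch: list the CSI drivers registered on a node by reading its CSINode
// object. Kubeconfig path is a placeholder for this environment.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		// kubevirt.io.hostpath-provisioner should appear here once registered.
		fmt.Println("registered:", d.Name)
	}
}
```
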
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.353293 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-tsdt2"] Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.353877 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.378875 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.390396 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.397510 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb2b1d65-2472-468b-8995-eb1943a63a04-serviceca\") pod \"node-ca-tsdt2\" (UID: \"fb2b1d65-2472-468b-8995-eb1943a63a04\") " pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.397553 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skk8j\" (UniqueName: \"kubernetes.io/projected/fb2b1d65-2472-468b-8995-eb1943a63a04-kube-api-access-skk8j\") pod \"node-ca-tsdt2\" (UID: \"fb2b1d65-2472-468b-8995-eb1943a63a04\") " pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.397590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.397622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb2b1d65-2472-468b-8995-eb1943a63a04-host\") pod \"node-ca-tsdt2\" (UID: \"fb2b1d65-2472-468b-8995-eb1943a63a04\") " pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.397650 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.397674 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.397702 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.397801 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:32:50 crc 
kubenswrapper[4722]: E1002 09:32:50.397817 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.397881 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:54.397860712 +0000 UTC m=+30.434613692 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.397930 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:54.397903253 +0000 UTC m=+30.434656443 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.397835 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.397969 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.397985 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.397841 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.398050 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.398065 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.398027 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:54.398011346 +0000 UTC m=+30.434764516 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.398126 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 09:32:54.398111419 +0000 UTC m=+30.434864599 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.409046 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.429534 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.449585 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.499288 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb2b1d65-2472-468b-8995-eb1943a63a04-serviceca\") pod \"node-ca-tsdt2\" (UID: \"fb2b1d65-2472-468b-8995-eb1943a63a04\") " pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.499356 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skk8j\" (UniqueName: \"kubernetes.io/projected/fb2b1d65-2472-468b-8995-eb1943a63a04-kube-api-access-skk8j\") pod \"node-ca-tsdt2\" (UID: \"fb2b1d65-2472-468b-8995-eb1943a63a04\") " pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.499412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb2b1d65-2472-468b-8995-eb1943a63a04-host\") pod \"node-ca-tsdt2\" (UID: \"fb2b1d65-2472-468b-8995-eb1943a63a04\") " pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.499542 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb2b1d65-2472-468b-8995-eb1943a63a04-host\") pod \"node-ca-tsdt2\" (UID: \"fb2b1d65-2472-468b-8995-eb1943a63a04\") " pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.499600 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.501151 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb2b1d65-2472-468b-8995-eb1943a63a04-serviceca\") pod \"node-ca-tsdt2\" (UID: \"fb2b1d65-2472-468b-8995-eb1943a63a04\") " pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.527457 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skk8j\" (UniqueName: \"kubernetes.io/projected/fb2b1d65-2472-468b-8995-eb1943a63a04-kube-api-access-skk8j\") pod \"node-ca-tsdt2\" (UID: \"fb2b1d65-2472-468b-8995-eb1943a63a04\") " pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.553674 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tsdt2" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.560731 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: W1002 09:32:50.573913 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2b1d65_2472_468b_8995_eb1943a63a04.slice/crio-f2791397af4141921f758898d5d649cc7a20db5ba8543228660a57f6de958e38 WatchSource:0}: Error finding container f2791397af4141921f758898d5d649cc7a20db5ba8543228660a57f6de958e38: Status 404 returned error can't find the container with id f2791397af4141921f758898d5d649cc7a20db5ba8543228660a57f6de958e38 Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.603095 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.639248 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.678502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.711097 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.711745 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.711682 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.712007 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.712190 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:32:50 crc kubenswrapper[4722]: E1002 09:32:50.712766 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.722983 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.760164 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.804224 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.841754 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.879896 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.895406 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tsdt2" event={"ID":"fb2b1d65-2472-468b-8995-eb1943a63a04","Type":"ContainerStarted","Data":"61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a"} Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.895466 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tsdt2" event={"ID":"fb2b1d65-2472-468b-8995-eb1943a63a04","Type":"ContainerStarted","Data":"f2791397af4141921f758898d5d649cc7a20db5ba8543228660a57f6de958e38"} Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.898098 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" event={"ID":"c3945e65-623a-40eb-82ae-8b17bcf8827b","Type":"ContainerStarted","Data":"ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97"} Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.899526 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346" exitCode=0 Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.899608 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346"} Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.899691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"852db608e5f10cf96b37d5670b47f7763fbe2e293ec97cd6faf3a5d2bc626dfd"} Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.901510 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zwlns" event={"ID":"04c84986-75a1-4683-8764-8f7307d1126a","Type":"ContainerStarted","Data":"460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af"} Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.901558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zwlns" event={"ID":"04c84986-75a1-4683-8764-8f7307d1126a","Type":"ContainerStarted","Data":"53fdd050e46d0a9ba115e44fda05f2d8cb54d01e4d861bf228b16233d18cf9f4"} Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.918802 4722 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:50 crc kubenswrapper[4722]: I1002 09:32:50.959739 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.000153 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:50Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.035797 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.080348 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.119760 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.158161 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.202194 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.239838 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.279526 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.318233 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.366036 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.397559 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.438476 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.481835 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.518270 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.563774 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.598237 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.640442 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.679166 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.721065 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.762971 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z 
is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.799904 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.906937 4722 generic.go:334] "Generic (PLEG): container finished" podID="c3945e65-623a-40eb-82ae-8b17bcf8827b" containerID="ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97" exitCode=0 Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.907014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" event={"ID":"c3945e65-623a-40eb-82ae-8b17bcf8827b","Type":"ContainerDied","Data":"ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97"} Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.912419 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c"} Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.912474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453"} Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.912488 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a"} Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.912502 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" 
event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389"} Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.912513 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4"} Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.912525 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a"} Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.924917 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.941304 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.960665 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z 
is after 2025-08-24T17:21:41Z" Oct 02 09:32:51 crc kubenswrapper[4722]: I1002 09:32:51.975134 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\
"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.001175 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:51Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.037787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.077259 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.117957 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.157478 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.201913 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.237606 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.280100 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.316010 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.359740 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.441786 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.443767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.443839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.443858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.444002 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.452280 4722 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.452957 4722 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.454779 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.454827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 
02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.454839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.454860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.454875 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:52Z","lastTransitionTime":"2025-10-02T09:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:52 crc kubenswrapper[4722]: E1002 09:32:52.472084 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.476900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.476948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.476961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.476995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.477021 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:52Z","lastTransitionTime":"2025-10-02T09:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:52 crc kubenswrapper[4722]: E1002 09:32:52.494057 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.498734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.498791 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.498802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.498822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.498834 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:52Z","lastTransitionTime":"2025-10-02T09:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:52 crc kubenswrapper[4722]: E1002 09:32:52.518420 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.523268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.523338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.523350 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.523369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.523380 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:52Z","lastTransitionTime":"2025-10-02T09:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:52 crc kubenswrapper[4722]: E1002 09:32:52.537269 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.541724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.541781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.541796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.541816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.541831 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:52Z","lastTransitionTime":"2025-10-02T09:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:52 crc kubenswrapper[4722]: E1002 09:32:52.556441 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: E1002 09:32:52.556581 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.558833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.558872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.558880 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.558902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.558913 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:52Z","lastTransitionTime":"2025-10-02T09:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.661899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.661935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.661944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.661960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.661969 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:52Z","lastTransitionTime":"2025-10-02T09:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.710122 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.710180 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.710201 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:52 crc kubenswrapper[4722]: E1002 09:32:52.710308 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:32:52 crc kubenswrapper[4722]: E1002 09:32:52.710672 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:32:52 crc kubenswrapper[4722]: E1002 09:32:52.710770 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.765542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.765601 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.765611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.765629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.765647 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:52Z","lastTransitionTime":"2025-10-02T09:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.868679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.868722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.868732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.868752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.868766 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:52Z","lastTransitionTime":"2025-10-02T09:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.920087 4722 generic.go:334] "Generic (PLEG): container finished" podID="c3945e65-623a-40eb-82ae-8b17bcf8827b" containerID="2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93" exitCode=0 Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.920163 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" event={"ID":"c3945e65-623a-40eb-82ae-8b17bcf8827b","Type":"ContainerDied","Data":"2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93"} Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.942788 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a57
8bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.958427 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.972387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.972430 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.972438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.972453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.972467 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:52Z","lastTransitionTime":"2025-10-02T09:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.973002 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:52 crc kubenswrapper[4722]: I1002 09:32:52.991903 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:52Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.028621 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z 
is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.046526 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.061416 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.080865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.080922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.080932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.080951 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.080966 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:53Z","lastTransitionTime":"2025-10-02T09:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.082326 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.098904 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.111818 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.123660 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.139089 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.150529 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.164125 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.183278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.183341 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.183376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.183393 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.183405 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:53Z","lastTransitionTime":"2025-10-02T09:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.286338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.286403 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.286419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.286442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.286457 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:53Z","lastTransitionTime":"2025-10-02T09:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.389063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.389109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.389119 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.389136 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.389148 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:53Z","lastTransitionTime":"2025-10-02T09:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.492860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.492920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.492936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.492957 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.492971 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:53Z","lastTransitionTime":"2025-10-02T09:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.602360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.602406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.602414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.602433 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.602444 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:53Z","lastTransitionTime":"2025-10-02T09:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.705179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.705241 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.705255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.705277 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.705293 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:53Z","lastTransitionTime":"2025-10-02T09:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.807735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.807785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.807797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.807824 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.807837 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:53Z","lastTransitionTime":"2025-10-02T09:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.910463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.910502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.910516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.910533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.910545 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:53Z","lastTransitionTime":"2025-10-02T09:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.928591 4722 generic.go:334] "Generic (PLEG): container finished" podID="c3945e65-623a-40eb-82ae-8b17bcf8827b" containerID="aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c" exitCode=0 Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.928688 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" event={"ID":"c3945e65-623a-40eb-82ae-8b17bcf8827b","Type":"ContainerDied","Data":"aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.933202 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28"} Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.947950 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.963673 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:53 crc kubenswrapper[4722]: I1002 09:32:53.988867 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:53Z 
is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.006017 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.013784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.013817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.013826 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.013842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.013856 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:54Z","lastTransitionTime":"2025-10-02T09:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.022136 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.033296 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.048846 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.063200 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.076188 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.089854 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.105229 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.116505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.116554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.116562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.116588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.116600 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:54Z","lastTransitionTime":"2025-10-02T09:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.120406 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.133989 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.151781 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.219479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.219533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.219546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.219568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.219582 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:54Z","lastTransitionTime":"2025-10-02T09:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.324349 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.329176 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.329299 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.329372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.329386 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:54Z","lastTransitionTime":"2025-10-02T09:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.341409 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.341645 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:33:02.341610389 +0000 UTC m=+38.378363369 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.431806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.431873 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.431887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.431908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.431923 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:54Z","lastTransitionTime":"2025-10-02T09:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.442388 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.442452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.442473 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.442492 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.442627 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.442681 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.442695 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.442746 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:02.442731767 +0000 UTC m=+38.479484747 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.443140 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.443190 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:02.443177378 +0000 UTC m=+38.479930358 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.443238 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.443266 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:02.44325756 +0000 UTC m=+38.480010550 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.443344 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.443361 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.443371 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.443398 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:02.443390004 +0000 UTC m=+38.480142984 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.537964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.538495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.538513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.538536 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.538552 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:54Z","lastTransitionTime":"2025-10-02T09:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.642259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.642336 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.642356 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.642381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.642398 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:54Z","lastTransitionTime":"2025-10-02T09:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.709645 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.709718 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.709668 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.709829 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.709922 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:32:54 crc kubenswrapper[4722]: E1002 09:32:54.709995 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.738537 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.745893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.745932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.745945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.745963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.745997 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:54Z","lastTransitionTime":"2025-10-02T09:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.769896 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.801251 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.822678 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.842393 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z 
is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.848238 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.848282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.848292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.848310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.848338 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:54Z","lastTransitionTime":"2025-10-02T09:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.858916 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.873012 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhoo
k\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.884060 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.898691 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.911839 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.923719 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.938895 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},
{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},
{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},
{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.941595 4722 generic.go:334] "Generic (PLEG): container finished" podID="c3945e65-623a-40eb-82ae-8b17bcf8827b" containerID="da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5" exitCode=0 Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.941638 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" event={"ID":"c3945e65-623a-40eb-82ae-8b17bcf8827b","Type":"ContainerDied","Data":"da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.950490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.950660 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.950726 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.950815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.950872 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:54Z","lastTransitionTime":"2025-10-02T09:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.953346 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.970433 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:54 crc kubenswrapper[4722]: I1002 09:32:54.991482 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1
ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.003867 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.020301 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.036416 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.053669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.053731 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.053749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.053772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.053791 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:55Z","lastTransitionTime":"2025-10-02T09:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.055645 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.070612 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.098622 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.122523 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z 
is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.137950 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"
Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.155218 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.156353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.156399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.156412 4722 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.156429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.156441 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:55Z","lastTransitionTime":"2025-10-02T09:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.168357 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.185548 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.199574 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.209112 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.260077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.260125 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.260135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.260152 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.260163 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:55Z","lastTransitionTime":"2025-10-02T09:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.364016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.364084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.364099 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.364126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.364142 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:55Z","lastTransitionTime":"2025-10-02T09:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.468472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.468560 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.468591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.468630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.468654 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:55Z","lastTransitionTime":"2025-10-02T09:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.571682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.571738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.571753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.571781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.571794 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:55Z","lastTransitionTime":"2025-10-02T09:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.674441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.674506 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.674518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.674538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.674555 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:55Z","lastTransitionTime":"2025-10-02T09:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.777417 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.777475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.777485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.777507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.777526 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:55Z","lastTransitionTime":"2025-10-02T09:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.881098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.881625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.881645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.881673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.881686 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:55Z","lastTransitionTime":"2025-10-02T09:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.941790 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.942633 4722 scope.go:117] "RemoveContainer" containerID="ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02" Oct 02 09:32:55 crc kubenswrapper[4722]: E1002 09:32:55.942871 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.953889 4722 generic.go:334] "Generic (PLEG): container finished" podID="c3945e65-623a-40eb-82ae-8b17bcf8827b" containerID="1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235" exitCode=0 Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.953936 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" event={"ID":"c3945e65-623a-40eb-82ae-8b17bcf8827b","Type":"ContainerDied","Data":"1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235"} Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.978051 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:55Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.989069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.989147 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.989168 4722 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.989198 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:55 crc kubenswrapper[4722]: I1002 09:32:55.989226 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:55Z","lastTransitionTime":"2025-10-02T09:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.004574 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.018665 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.031450 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.044254 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.057402 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.076408 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.092719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.092775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.092790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.092816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.092835 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:56Z","lastTransitionTime":"2025-10-02T09:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.093279 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.109114 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.129814 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.147204 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.166656 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.196267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.196334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.196344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.196361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.196373 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:56Z","lastTransitionTime":"2025-10-02T09:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.200785 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d
48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.218755 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.299422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.299471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.299481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.300017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.300065 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:56Z","lastTransitionTime":"2025-10-02T09:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.403016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.403077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.403087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.403162 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.403175 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:56Z","lastTransitionTime":"2025-10-02T09:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.506188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.506233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.506243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.506260 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.506274 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:56Z","lastTransitionTime":"2025-10-02T09:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.609948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.610007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.610020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.610048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.610064 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:56Z","lastTransitionTime":"2025-10-02T09:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.709589 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.709675 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.709737 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:56 crc kubenswrapper[4722]: E1002 09:32:56.709905 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:32:56 crc kubenswrapper[4722]: E1002 09:32:56.710000 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:32:56 crc kubenswrapper[4722]: E1002 09:32:56.710151 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.712645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.712708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.712721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.712738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.712782 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:56Z","lastTransitionTime":"2025-10-02T09:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.817233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.817533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.817618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.817647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.817660 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:56Z","lastTransitionTime":"2025-10-02T09:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.920423 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.920457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.920465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.920479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.920490 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:56Z","lastTransitionTime":"2025-10-02T09:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.960607 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" event={"ID":"c3945e65-623a-40eb-82ae-8b17bcf8827b","Type":"ContainerStarted","Data":"497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.965012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"7fcb0a7df941cfb36989037ad56c1e860f3d9252b30b0d5b327b409aebdcf815"} Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.965284 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.976155 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.991118 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:56 crc kubenswrapper[4722]: I1002 09:32:56.991204 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d
3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:56Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.001601 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.013788 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.022304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.022360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.022371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.022387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.022401 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:57Z","lastTransitionTime":"2025-10-02T09:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.030507 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d
48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.043788 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.056409 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.067528 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.080368 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.093627 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.104557 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.113750 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.124901 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.126326 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.126354 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.126365 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.126382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.126403 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:57Z","lastTransitionTime":"2025-10-02T09:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.150681 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.167086 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.182745 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.198005 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.211703 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.228729 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcb0a7df941cfb36989037ad56c1e860f3d9252
b30b0d5b327b409aebdcf815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.229247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.229278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.229289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.229311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.229342 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:57Z","lastTransitionTime":"2025-10-02T09:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.245560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.259540 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.273201 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.289592 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.304984 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.317885 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.329377 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.331605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.331641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.331655 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.331672 4722 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.331686 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:57Z","lastTransitionTime":"2025-10-02T09:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.343478 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.356480 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:57Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.434832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.434888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.434901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.434921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.434937 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:57Z","lastTransitionTime":"2025-10-02T09:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.537923 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.537986 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.537998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.538020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.538033 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:57Z","lastTransitionTime":"2025-10-02T09:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.640153 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.640222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.640232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.640253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.640265 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:57Z","lastTransitionTime":"2025-10-02T09:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.743645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.743711 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.743733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.743768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.743789 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:57Z","lastTransitionTime":"2025-10-02T09:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.846686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.846743 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.846760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.846785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.846803 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:57Z","lastTransitionTime":"2025-10-02T09:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.950014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.950085 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.950099 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.950126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.950149 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:57Z","lastTransitionTime":"2025-10-02T09:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.968916 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.969654 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:57 crc kubenswrapper[4722]: I1002 09:32:57.997931 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.012003 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.031694 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.047043 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.052568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.052597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.052607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.052623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.052635 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:58Z","lastTransitionTime":"2025-10-02T09:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.065649 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.094268 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcb0a7df941cfb36989037ad56c1e860f3d9252b30b0d5b327b409aebdcf815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.113449 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.133152 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.147125 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.156417 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.156508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.156557 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.156589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.156606 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:58Z","lastTransitionTime":"2025-10-02T09:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.166903 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.190188 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
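Note on reading the "failed to patch status" entries above: the kubelet's JSON patch is logged as a quoted string inside the quoted err= field, so every quote inside the patch is escaped once per logging layer (hence the runs of backslashes). A rough Go helper for inspecting these, assuming the payload is a %q-quoted JSON document wrapped in one or more quoting layers (the file name and the shortened sample are illustrative only):

    // decodepatch.go: peel quoting layers off a logged patch until it
    // parses as JSON, then pretty-print it. Illustrative sketch only.
    package main

    import (
        "encoding/json"
        "fmt"
        "strconv"
    )

    func main() {
        // Shortened sample of a quoted patch from the entries above; a
        // real run would paste the full err= payload here instead.
        quoted := `"{\"metadata\":{\"uid\":\"ef543e1b-8068-4ea3-b32a-61027b32e95d\"}}"`

        s := quoted
        for {
            u, err := strconv.Unquote(s)
            if err != nil {
                break // no quoting layer left to remove
            }
            s = u
        }

        var patch map[string]any
        if err := json.Unmarshal([]byte(s), &patch); err != nil {
            fmt.Println("not valid JSON after unquoting:", err)
            return
        }
        pretty, _ := json.MarshalIndent(patch, "", "  ")
        fmt.Println(string(pretty))
    }

Applied to the full payload of any entry above, this recovers an ordinary {"metadata":...,"status":...} document with single-level quoting.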
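The root cause named in every one of these errors is the same: the admission webhook pod.network-node-identity.openshift.io, served at https://127.0.0.1:9743, presents a TLS certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-10-02, so each status PATCH fails TLS verification before it ever reaches the API object. The failure is the standard x509 validity-window test; a minimal sketch of that check, assuming the serving certificate has been exported to a local PEM file (the path is hypothetical):

    // certexpiry.go: the validity-window comparison behind
    // "x509: certificate has expired or is not yet valid".
    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "log"
        "os"
        "time"
    )

    func main() {
        data, err := os.ReadFile("webhook-tls.crt") // hypothetical export of the serving cert
        if err != nil {
            log.Fatal(err)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            log.Fatal("no PEM block found")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            log.Fatal(err)
        }
        now := time.Now().UTC()
        switch {
        case now.After(cert.NotAfter):
            fmt.Printf("expired: current time %s is after %s\n",
                now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
        case now.Before(cert.NotBefore):
            fmt.Printf("not yet valid before %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
        default:
            fmt.Printf("valid until %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
        }
    }

On a CRC instance this pattern typically means the VM was resumed long after its bundled certificates lapsed; the cluster normally rotates them itself once the control plane settles.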
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.199353 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.213139 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.225330 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
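The kube-apiserver-check-endpoints container above exits with code 255 ("pods \"kube-apiserver-crc\" not found") and lands in CrashLoopBackOff at "back-off 10s". By kubelet default the crash-loop delay starts at 10s and doubles on each failed restart up to a 5-minute cap, resetting only after the container stays up long enough; a toy illustration of that schedule (constants assumed from kubelet defaults, not read from this cluster's config):

    // backoff.go: kubelet-style crash-loop delay, 10s doubling to a 5m cap.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 10 * time.Second        // assumed initial container backoff
        const maxDelay = 5 * time.Minute // assumed max container backoff

        for restart := 1; restart <= 7; restart++ {
            fmt.Printf("restart %d: back-off %s\n", restart, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

That schedule (10s, 20s, 40s, 1m20s, 2m40s, then pinned at 5m) is why the waiting reason alternates between CrashLoopBackOff and a fresh start attempt in logs like this one.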
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.239684 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:58Z is after 2025-08-24T17:21:41Z" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.260018 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.260056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.260066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.260084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.260113 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:58Z","lastTransitionTime":"2025-10-02T09:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.363586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.363656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.363679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.363730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.363743 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:58Z","lastTransitionTime":"2025-10-02T09:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.472306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.472375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.472383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.472401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.472412 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:58Z","lastTransitionTime":"2025-10-02T09:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.575282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.575377 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.575395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.575424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.575442 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:58Z","lastTransitionTime":"2025-10-02T09:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.678424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.678485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.678499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.678522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.678535 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:58Z","lastTransitionTime":"2025-10-02T09:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.710238 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.710361 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.710427 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:32:58 crc kubenswrapper[4722]: E1002 09:32:58.710480 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:32:58 crc kubenswrapper[4722]: E1002 09:32:58.710602 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:32:58 crc kubenswrapper[4722]: E1002 09:32:58.710760 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.781519 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.781600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.781613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.781630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.781641 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:58Z","lastTransitionTime":"2025-10-02T09:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.884442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.884485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.884493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.884508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.884520 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:58Z","lastTransitionTime":"2025-10-02T09:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.971965 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.987488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.987543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.987567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.987587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:58 crc kubenswrapper[4722]: I1002 09:32:58.987604 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:58Z","lastTransitionTime":"2025-10-02T09:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
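All of the recurring NodeNotReady blocks above reduce to a single condition: the container runtime reports NetworkReady=false because no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/ (ovnkube-node is still failing, so it has not written one). The authoritative check lives in the runtime's CNI plumbing (ocicni, in CRI-O's case); the sketch below only mirrors its spirit by scanning the directory named in the log:

    // cnicheck.go: is there any CNI network config for the runtime to load?
    package main

    import (
        "fmt"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // conf dir named in the kubelet log
        var confs []string
        for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
            matches, err := filepath.Glob(filepath.Join(dir, pattern))
            if err != nil {
                continue // unreachable with the literal patterns above
            }
            confs = append(confs, matches...)
        }
        if len(confs) == 0 {
            fmt.Printf("no CNI configuration file in %s - network not ready\n", dir)
            return
        }
        fmt.Println("CNI configs found:", confs)
    }

Once ovnkube writes its config into that directory, the node's Ready condition should flip back on the next kubelet sync.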
Has your network provider started?"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.089961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.090004 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.090015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.090035 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.090049 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:59Z","lastTransitionTime":"2025-10-02T09:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.192464 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.192539 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.192565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.192603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.192631 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:59Z","lastTransitionTime":"2025-10-02T09:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.296689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.296755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.296771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.296795 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.296812 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:59Z","lastTransitionTime":"2025-10-02T09:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.399373 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.399429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.399445 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.399508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.399524 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:59Z","lastTransitionTime":"2025-10-02T09:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.502015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.502057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.502074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.502093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.502105 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:59Z","lastTransitionTime":"2025-10-02T09:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.603957 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.604001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.604011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.604026 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.604037 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:59Z","lastTransitionTime":"2025-10-02T09:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.707182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.707233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.707244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.707262 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.707274 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:59Z","lastTransitionTime":"2025-10-02T09:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.810661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.810763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.810870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.810917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.810940 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:59Z","lastTransitionTime":"2025-10-02T09:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.913568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.913615 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.913623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.913639 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.913648 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:32:59Z","lastTransitionTime":"2025-10-02T09:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.978257 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/0.log" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.980266 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="7fcb0a7df941cfb36989037ad56c1e860f3d9252b30b0d5b327b409aebdcf815" exitCode=1 Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.980334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"7fcb0a7df941cfb36989037ad56c1e860f3d9252b30b0d5b327b409aebdcf815"} Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.980962 4722 scope.go:117] "RemoveContainer" containerID="7fcb0a7df941cfb36989037ad56c1e860f3d9252b30b0d5b327b409aebdcf815" Oct 02 09:32:59 crc kubenswrapper[4722]: I1002 09:32:59.996937 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:32:59Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.011055 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.016509 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.016553 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.016565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.016583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.016597 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:00Z","lastTransitionTime":"2025-10-02T09:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
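For reference, the patches being rejected throughout this section are strategic merge patches against the pod's status subresource: the $setElementOrder/conditions directive carries the desired ordering of the conditions list (merged by its "type" key), while "conditions" itself carries only the entries that changed. A minimal sketch of that shape (the UID is copied from an entry above; the request details in the trailing comment are the standard subresource path):

    // patchshape.go: the strategic-merge-patch layout seen in the log.
    package main

    import (
        "encoding/json"
        "fmt"
    )

    func main() {
        patch := map[string]any{
            "metadata": map[string]any{"uid": "ef543e1b-8068-4ea3-b32a-61027b32e95d"},
            "status": map[string]any{
                // Desired order of the conditions slice, merge key "type".
                "$setElementOrder/conditions": []map[string]string{
                    {"type": "PodReadyToStartContainers"}, {"type": "Initialized"},
                    {"type": "Ready"}, {"type": "ContainersReady"}, {"type": "PodScheduled"},
                },
                // Only the changed condition entries travel in the patch.
                "conditions": []map[string]string{
                    {"type": "Ready", "status": "True", "lastTransitionTime": "2025-10-02T09:32:47Z"},
                },
            },
        }
        body, _ := json.Marshal(patch)
        // Sent as PATCH .../api/v1/namespaces/<ns>/pods/<name>/status with
        // Content-Type: application/strategic-merge-patch+json; the failing
        // admission webhook intercepts exactly this call.
        fmt.Println(string(body))
    }

This is why every pod on the node fails the same way: it is not the pods that are broken, but the one webhook sitting in front of all status updates.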
Has your network provider started?"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.030856 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcb0a7df941cfb36989037ad56c1e860f3d9252b30b0d5b327b409aebdcf815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcb0a7df941cfb36989037ad56c1e860f3d9252b30b0d5b327b409aebdcf815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:32:59Z\\\",\\\"message\\\":\\\"9:32:59.052045 6001 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 09:32:59.054089 6001 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 09:32:59.054113 6001 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 09:32:59.054262 6001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 09:32:59.054373 6001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 09:32:59.054586 6001 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 09:32:59.054673 6001 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 09:32:59.054780 6001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 09:32:59.054876 6001 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 09:32:59.054956 6001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 09:32:59.055160 6001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 09:32:59.055165 6001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 09:32:59.055189 6001 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 09:32:59.055234 6001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 09:32:59.055303 6001 factory.go:656] Stopping watch factory\\\\nI1002 09:32:59.055413 6001 ovnkube.go:599] Stopped ovnkube\\\\nI1002 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.049478 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.066445 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.079683 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.093482 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.111272 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.119375 4722 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.119508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.119524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.119596 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.119611 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:00Z","lastTransitionTime":"2025-10-02T09:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.124703 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.162906 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.183525 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.211007 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.221902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.221941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.221952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.221967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.221978 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:00Z","lastTransitionTime":"2025-10-02T09:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.224975 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.237034 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.324822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.324884 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.324903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.324928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.324946 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:00Z","lastTransitionTime":"2025-10-02T09:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.427585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.427637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.427648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.427668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.427680 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:00Z","lastTransitionTime":"2025-10-02T09:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.531434 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.531498 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.531512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.531528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.531539 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:00Z","lastTransitionTime":"2025-10-02T09:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.634355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.634413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.634425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.634446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.634463 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:00Z","lastTransitionTime":"2025-10-02T09:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.736793 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.736799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:00 crc kubenswrapper[4722]: E1002 09:33:00.737097 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:00 crc kubenswrapper[4722]: E1002 09:33:00.737200 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.737632 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:00 crc kubenswrapper[4722]: E1002 09:33:00.737918 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.738017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.738140 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.738172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.738193 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.738207 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:00Z","lastTransitionTime":"2025-10-02T09:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.840781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.840826 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.840842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.840858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.840869 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:00Z","lastTransitionTime":"2025-10-02T09:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.943962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.944007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.944021 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.944039 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.944051 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:00Z","lastTransitionTime":"2025-10-02T09:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.986831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/1.log" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.988038 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/0.log" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.991677 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae" exitCode=1 Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.991749 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae"} Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.991852 4722 scope.go:117] "RemoveContainer" containerID="7fcb0a7df941cfb36989037ad56c1e860f3d9252b30b0d5b327b409aebdcf815" Oct 02 09:33:00 crc kubenswrapper[4722]: I1002 09:33:00.993119 4722 scope.go:117] "RemoveContainer" containerID="a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae" Oct 02 09:33:00 crc kubenswrapper[4722]: E1002 09:33:00.993541 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.016100 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.039369 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.046013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.046065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.046083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.046105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.046118 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:01Z","lastTransitionTime":"2025-10-02T09:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.056168 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.075359 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.091344 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.106279 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.123114 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.145917 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd2359
5cfd096e941280525aa683ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcb0a7df941cfb36989037ad56c1e860f3d9252b30b0d5b327b409aebdcf815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:32:59Z\\\",\\\"message\\\":\\\"9:32:59.052045 6001 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 09:32:59.054089 6001 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 09:32:59.054113 6001 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 09:32:59.054262 6001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 09:32:59.054373 6001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 09:32:59.054586 6001 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 09:32:59.054673 6001 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 09:32:59.054780 6001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 09:32:59.054876 6001 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 09:32:59.054956 6001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 09:32:59.055160 6001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 09:32:59.055165 6001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 09:32:59.055189 6001 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 09:32:59.055234 6001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 09:32:59.055303 6001 factory.go:656] Stopping watch factory\\\\nI1002 09:32:59.055413 6001 ovnkube.go:599] Stopped ovnkube\\\\nI1002 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initCon
tainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.149033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.149094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.149107 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.149123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.149137 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:01Z","lastTransitionTime":"2025-10-02T09:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.162464 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.175439 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.180968 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q"] Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.181688 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.184138 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.185723 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.189026 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.204543 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.217739 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.234079 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.242399 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc362a8f-ced8-4af3-b908-f856e59845fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.242513 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpwr9\" (UniqueName: \"kubernetes.io/projected/bc362a8f-ced8-4af3-b908-f856e59845fe-kube-api-access-qpwr9\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc 
kubenswrapper[4722]: I1002 09:33:01.242745 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc362a8f-ced8-4af3-b908-f856e59845fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.242845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc362a8f-ced8-4af3-b908-f856e59845fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.249628 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.251773 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.251836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.251856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.251883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.251905 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:01Z","lastTransitionTime":"2025-10-02T09:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.263475 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.282772 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcb0a7df941cfb36989037ad56c1e860f3d9252b30b0d5b327b409aebdcf815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:32:59Z\\\",\\\"message\\\":\\\"9:32:59.052045 6001 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 09:32:59.054089 6001 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 09:32:59.054113 6001 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 09:32:59.054262 6001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 09:32:59.054373 6001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 09:32:59.054586 6001 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 09:32:59.054673 6001 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 09:32:59.054780 6001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 09:32:59.054876 6001 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 09:32:59.054956 6001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 09:32:59.055160 6001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 09:32:59.055165 6001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 09:32:59.055189 6001 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 09:32:59.055234 6001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 09:32:59.055303 6001 factory.go:656] Stopping watch factory\\\\nI1002 09:32:59.055413 6001 ovnkube.go:599] Stopped ovnkube\\\\nI1002 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.302957 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.318906 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.334606 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.343893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc362a8f-ced8-4af3-b908-f856e59845fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.343940 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwr9\" (UniqueName: \"kubernetes.io/projected/bc362a8f-ced8-4af3-b908-f856e59845fe-kube-api-access-qpwr9\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.343980 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc362a8f-ced8-4af3-b908-f856e59845fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.344074 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc362a8f-ced8-4af3-b908-f856e59845fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.344923 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc362a8f-ced8-4af3-b908-f856e59845fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.345534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc362a8f-ced8-4af3-b908-f856e59845fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.360845 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc362a8f-ced8-4af3-b908-f856e59845fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.362820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.362855 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.362864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.362881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.362891 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:01Z","lastTransitionTime":"2025-10-02T09:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.363452 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.364051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpwr9\" (UniqueName: 
\"kubernetes.io/projected/bc362a8f-ced8-4af3-b908-f856e59845fe-kube-api-access-qpwr9\") pod \"ovnkube-control-plane-749d76644c-sjg5q\" (UID: \"bc362a8f-ced8-4af3-b908-f856e59845fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.381597 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.399653 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.419777 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.435340 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.451564 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.466494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.466815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.466929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.467055 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.467130 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:01Z","lastTransitionTime":"2025-10-02T09:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.467730 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.481466 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.494189 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:01Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.494545 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" Oct 02 09:33:01 crc kubenswrapper[4722]: W1002 09:33:01.522032 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc362a8f_ced8_4af3_b908_f856e59845fe.slice/crio-2bf93e7b1bb81c79e11e65ab49ae03cc97762d4dc5dff6af4c186e334c32671c WatchSource:0}: Error finding container 2bf93e7b1bb81c79e11e65ab49ae03cc97762d4dc5dff6af4c186e334c32671c: Status 404 returned error can't find the container with id 2bf93e7b1bb81c79e11e65ab49ae03cc97762d4dc5dff6af4c186e334c32671c Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.570179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.570233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.570246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.570268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.570282 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:01Z","lastTransitionTime":"2025-10-02T09:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.672983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.673026 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.673035 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.673053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.673063 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:01Z","lastTransitionTime":"2025-10-02T09:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.832879 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.832928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.832937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.832955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.832967 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:01Z","lastTransitionTime":"2025-10-02T09:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.936129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.936185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.936196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.936219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.936235 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:01Z","lastTransitionTime":"2025-10-02T09:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:01 crc kubenswrapper[4722]: I1002 09:33:01.998457 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/1.log" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.002207 4722 scope.go:117] "RemoveContainer" containerID="a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae" Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.002386 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.002782 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" event={"ID":"bc362a8f-ced8-4af3-b908-f856e59845fe","Type":"ContainerStarted","Data":"2bf93e7b1bb81c79e11e65ab49ae03cc97762d4dc5dff6af4c186e334c32671c"} Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.019140 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f5
1e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/ho
st/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"e
xitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.031444 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.039097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.039161 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.039180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.039207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.039226 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.045853 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.064133 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.077300 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.090397 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.101392 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.114057 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.126787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.143457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.143519 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.143568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.143602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.143620 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.145617 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.156138 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.169291 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.180153 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.192083 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.203864 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.246008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.246059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.246069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.246083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.246095 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.306510 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jn9jw"] Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.307011 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.307068 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.316573 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: 
I1002 09:33:02.329755 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\"
:\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.342254 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.348230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.348279 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.348289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.348305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 
02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.348332 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.354913 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.361569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.361705 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgg5w\" (UniqueName: \"kubernetes.io/projected/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-kube-api-access-rgg5w\") pod \"network-metrics-daemon-jn9jw\" 
(UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.361759 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:33:18.361713358 +0000 UTC m=+54.398466338 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.361799 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.368640 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.382291 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.397470 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.411057 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.425923 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.445237 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd2359
5cfd096e941280525aa683ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.450461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.450499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.450508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.450525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.450536 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.459613 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.462597 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.462650 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.462675 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.462699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.462720 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgg5w\" (UniqueName: \"kubernetes.io/projected/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-kube-api-access-rgg5w\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.462747 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.462803 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.462812 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.462862 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:18.462845607 +0000 UTC m=+54.499598587 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.462904 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.462927 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.462940 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.462944 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs podName:3d43b03f-1f3f-4824-9eda-93e405f6ee9e nodeName:}" failed. No retries permitted until 2025-10-02 09:33:02.962913428 +0000 UTC m=+38.999666408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs") pod "network-metrics-daemon-jn9jw" (UID: "3d43b03f-1f3f-4824-9eda-93e405f6ee9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.462976 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:18.46296413 +0000 UTC m=+54.499717180 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.463035 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.463046 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.463085 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.463097 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.463069 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:18.463061602 +0000 UTC m=+54.499814652 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.463231 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:18.463177685 +0000 UTC m=+54.499930665 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.472297 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.487049 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgg5w\" (UniqueName: \"kubernetes.io/projected/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-kube-api-access-rgg5w\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.489927 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.502969 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.514172 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.526158 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.552105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.552143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.552154 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.552172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.552185 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.654473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.654525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.654536 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.654551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.654562 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.709572 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.709572 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.709721 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.709742 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.709587 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.709796 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.756985 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.757032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.757043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.757058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.757071 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.818478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.818512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.818521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.818535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.818544 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.830139 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.833360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.833395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.833404 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.833418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.833428 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.845251 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.848936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.848969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.848982 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.849000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.849011 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.859578 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.864660 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.864831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.864855 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.864875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.864887 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.875985 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.882271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.882343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.882359 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.882382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.882401 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.900850 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:02Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.901052 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.903233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.903273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.903294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.903331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.903347 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:02Z","lastTransitionTime":"2025-10-02T09:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:02 crc kubenswrapper[4722]: I1002 09:33:02.967061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.967343 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:02 crc kubenswrapper[4722]: E1002 09:33:02.967495 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs podName:3d43b03f-1f3f-4824-9eda-93e405f6ee9e nodeName:}" failed. No retries permitted until 2025-10-02 09:33:03.96746179 +0000 UTC m=+40.004214780 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs") pod "network-metrics-daemon-jn9jw" (UID: "3d43b03f-1f3f-4824-9eda-93e405f6ee9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.005784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.005838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.005847 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.005871 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.005884 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:03Z","lastTransitionTime":"2025-10-02T09:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.008353 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" event={"ID":"bc362a8f-ced8-4af3-b908-f856e59845fe","Type":"ContainerStarted","Data":"ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.107733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.107777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.107786 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.107804 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.107817 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:03Z","lastTransitionTime":"2025-10-02T09:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.209941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.209974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.209984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.210000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.210009 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:03Z","lastTransitionTime":"2025-10-02T09:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.312958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.313004 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.313020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.313043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.313060 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:03Z","lastTransitionTime":"2025-10-02T09:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.416188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.416231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.416243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.416262 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.416275 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:03Z","lastTransitionTime":"2025-10-02T09:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.518742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.518788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.518799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.518862 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.518876 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:03Z","lastTransitionTime":"2025-10-02T09:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.621473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.621508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.621516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.621535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.621545 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:03Z","lastTransitionTime":"2025-10-02T09:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.723208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.723247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.723259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.723273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.723287 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:03Z","lastTransitionTime":"2025-10-02T09:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.825433 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.825472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.825484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.825501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.825512 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:03Z","lastTransitionTime":"2025-10-02T09:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.928955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.929000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.929010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.929024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.929033 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:03Z","lastTransitionTime":"2025-10-02T09:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:03 crc kubenswrapper[4722]: I1002 09:33:03.978811 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:03 crc kubenswrapper[4722]: E1002 09:33:03.978936 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:03 crc kubenswrapper[4722]: E1002 09:33:03.979003 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs podName:3d43b03f-1f3f-4824-9eda-93e405f6ee9e nodeName:}" failed. No retries permitted until 2025-10-02 09:33:05.978987217 +0000 UTC m=+42.015740197 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs") pod "network-metrics-daemon-jn9jw" (UID: "3d43b03f-1f3f-4824-9eda-93e405f6ee9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.013592 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" event={"ID":"bc362a8f-ced8-4af3-b908-f856e59845fe","Type":"ContainerStarted","Data":"61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd"} Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.027473 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.030973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.031020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.031030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:04 crc 
kubenswrapper[4722]: I1002 09:33:04.031050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.031060 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:04Z","lastTransitionTime":"2025-10-02T09:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.037411 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.048022 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.058114 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.068231 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.078616 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.087674 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.099819 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.110891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.120642 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.133351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.133379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.133387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.133400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.133409 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:04Z","lastTransitionTime":"2025-10-02T09:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.137404 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.149953 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.158631 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.167391 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.179426 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.191424 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.236495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.236547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.236560 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.236578 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.236592 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:04Z","lastTransitionTime":"2025-10-02T09:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.339683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.339724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.339734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.339749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.339765 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:04Z","lastTransitionTime":"2025-10-02T09:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.442369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.442403 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.442414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.442428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.442437 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:04Z","lastTransitionTime":"2025-10-02T09:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.544848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.544896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.544916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.544937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.544953 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:04Z","lastTransitionTime":"2025-10-02T09:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.647846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.647899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.647909 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.647926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.647986 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:04Z","lastTransitionTime":"2025-10-02T09:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.710017 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.710145 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.710182 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.710260 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw"
Oct 02 09:33:04 crc kubenswrapper[4722]: E1002 09:33:04.710285 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 09:33:04 crc kubenswrapper[4722]: E1002 09:33:04.710380 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e"
Oct 02 09:33:04 crc kubenswrapper[4722]: E1002 09:33:04.710446 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 09:33:04 crc kubenswrapper[4722]: E1002 09:33:04.710486 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.723426 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.735006 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.747222 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.750885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.750912 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.750921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.750935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.750943 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:04Z","lastTransitionTime":"2025-10-02T09:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.762568 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.774989 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.791787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.804122 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.816120 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.829501 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.840062 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.848587 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.853079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.853102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.853110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.853122 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.853132 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:04Z","lastTransitionTime":"2025-10-02T09:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.862658 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.875570 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.887840 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.904533 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd2359
5cfd096e941280525aa683ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.920208 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.954951 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.954991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.955004 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.955023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:04 crc kubenswrapper[4722]: I1002 09:33:04.955033 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:04Z","lastTransitionTime":"2025-10-02T09:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.057199 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.057441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.057527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.057587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.057650 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:05Z","lastTransitionTime":"2025-10-02T09:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.159919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.160207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.160272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.160361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.160426 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:05Z","lastTransitionTime":"2025-10-02T09:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.262356 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.262392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.262404 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.262419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.262430 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:05Z","lastTransitionTime":"2025-10-02T09:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.365076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.365128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.365141 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.365158 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.365169 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:05Z","lastTransitionTime":"2025-10-02T09:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.467015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.467050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.467063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.467081 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.467094 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:05Z","lastTransitionTime":"2025-10-02T09:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.569702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.569741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.569753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.569768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.569779 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:05Z","lastTransitionTime":"2025-10-02T09:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.671902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.671943 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.671952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.671965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.671973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:05Z","lastTransitionTime":"2025-10-02T09:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.774740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.774784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.774794 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.774808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.774817 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:05Z","lastTransitionTime":"2025-10-02T09:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.877474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.877540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.877550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.877567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.877603 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:05Z","lastTransitionTime":"2025-10-02T09:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.980824 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.980861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.980870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.980885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:05 crc kubenswrapper[4722]: I1002 09:33:05.980896 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:05Z","lastTransitionTime":"2025-10-02T09:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:05.999975 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:06 crc kubenswrapper[4722]: E1002 09:33:06.000241 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:06 crc kubenswrapper[4722]: E1002 09:33:06.000289 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs podName:3d43b03f-1f3f-4824-9eda-93e405f6ee9e nodeName:}" failed. No retries permitted until 2025-10-02 09:33:10.000275914 +0000 UTC m=+46.037028914 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs") pod "network-metrics-daemon-jn9jw" (UID: "3d43b03f-1f3f-4824-9eda-93e405f6ee9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.083115 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.083154 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.083166 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.083181 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.083192 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:06Z","lastTransitionTime":"2025-10-02T09:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.186252 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.186328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.186339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.186353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.186362 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:06Z","lastTransitionTime":"2025-10-02T09:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.288524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.288562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.288571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.288585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.288595 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:06Z","lastTransitionTime":"2025-10-02T09:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.390459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.390492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.390502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.390517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.390528 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:06Z","lastTransitionTime":"2025-10-02T09:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.492801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.492837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.492847 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.492863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.492876 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:06Z","lastTransitionTime":"2025-10-02T09:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.595437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.595729 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.595817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.595906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.596013 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:06Z","lastTransitionTime":"2025-10-02T09:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.699392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.699449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.699461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.699479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.699526 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:06Z","lastTransitionTime":"2025-10-02T09:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.710021 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.710022 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.710029 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:06 crc kubenswrapper[4722]: E1002 09:33:06.710394 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:06 crc kubenswrapper[4722]: E1002 09:33:06.710404 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:06 crc kubenswrapper[4722]: E1002 09:33:06.710466 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.711012 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:06 crc kubenswrapper[4722]: E1002 09:33:06.711356 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.802026 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.802093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.802108 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.802123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.802136 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:06Z","lastTransitionTime":"2025-10-02T09:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.905112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.905162 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.905174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.905193 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:06 crc kubenswrapper[4722]: I1002 09:33:06.905207 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:06Z","lastTransitionTime":"2025-10-02T09:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.007970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.008022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.008034 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.008053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.008065 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:07Z","lastTransitionTime":"2025-10-02T09:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.110200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.110238 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.110249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.110287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.110300 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:07Z","lastTransitionTime":"2025-10-02T09:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.212548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.212778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.212845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.212919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.212978 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:07Z","lastTransitionTime":"2025-10-02T09:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.315005 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.315062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.315071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.315086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.315095 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:07Z","lastTransitionTime":"2025-10-02T09:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.417191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.417491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.417618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.417701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.417777 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:07Z","lastTransitionTime":"2025-10-02T09:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.520407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.520656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.520720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.520823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.520889 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:07Z","lastTransitionTime":"2025-10-02T09:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.623992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.624299 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.624405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.624480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.624561 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:07Z","lastTransitionTime":"2025-10-02T09:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.711041 4722 scope.go:117] "RemoveContainer" containerID="ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.729270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.729305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.729328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.729342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.729354 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:07Z","lastTransitionTime":"2025-10-02T09:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.832213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.832248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.832261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.832276 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.832285 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:07Z","lastTransitionTime":"2025-10-02T09:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.934918 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.934965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.934979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.934996 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:07 crc kubenswrapper[4722]: I1002 09:33:07.935006 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:07Z","lastTransitionTime":"2025-10-02T09:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.032572 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.034068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d"}
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.034444 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.036707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.036738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.036748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.036762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.036776 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:08Z","lastTransitionTime":"2025-10-02T09:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.049493 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.063818 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.074420 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.088923 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.101106 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.112744 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.125812 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.139482 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.139525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.139534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.139552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.139569 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:08Z","lastTransitionTime":"2025-10-02T09:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.140385 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.158957 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.175173 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.187543 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.197701 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.208931 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.220125 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.229840 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.238229 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:08Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.241980 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.242025 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.242036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.242052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.242064 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:08Z","lastTransitionTime":"2025-10-02T09:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.344166 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.344208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.344215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.344230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.344240 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:08Z","lastTransitionTime":"2025-10-02T09:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.446664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.446708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.446717 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.446732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.446742 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:08Z","lastTransitionTime":"2025-10-02T09:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.549043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.549075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.549083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.549096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.549105 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:08Z","lastTransitionTime":"2025-10-02T09:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.651540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.651609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.651626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.651643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.651654 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:08Z","lastTransitionTime":"2025-10-02T09:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.710114 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.710150 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.710185 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.710219 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:08 crc kubenswrapper[4722]: E1002 09:33:08.710278 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:08 crc kubenswrapper[4722]: E1002 09:33:08.710422 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:08 crc kubenswrapper[4722]: E1002 09:33:08.710499 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:08 crc kubenswrapper[4722]: E1002 09:33:08.710543 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.753765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.753820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.753833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.753848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.753858 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:08Z","lastTransitionTime":"2025-10-02T09:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.856251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.856304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.856332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.856347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.856357 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:08Z","lastTransitionTime":"2025-10-02T09:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.958086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.958133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.958143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.958158 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:08 crc kubenswrapper[4722]: I1002 09:33:08.958169 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:08Z","lastTransitionTime":"2025-10-02T09:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.059721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.059760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.059772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.059789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.059800 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:09Z","lastTransitionTime":"2025-10-02T09:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.162445 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.162495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.162506 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.162541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.162555 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:09Z","lastTransitionTime":"2025-10-02T09:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.264667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.264715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.264726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.264743 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.264754 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:09Z","lastTransitionTime":"2025-10-02T09:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.367086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.367124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.367135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.367152 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.367165 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:09Z","lastTransitionTime":"2025-10-02T09:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.469905 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.469942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.469953 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.469971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.469982 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:09Z","lastTransitionTime":"2025-10-02T09:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.571866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.571904 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.571912 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.571927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.571937 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:09Z","lastTransitionTime":"2025-10-02T09:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.674709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.674766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.674777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.674791 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.674800 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:09Z","lastTransitionTime":"2025-10-02T09:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.777375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.777428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.777440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.777459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.777471 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:09Z","lastTransitionTime":"2025-10-02T09:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.879500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.879571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.879580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.879617 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.879627 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:09Z","lastTransitionTime":"2025-10-02T09:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.985415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.985465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.985475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.985492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:09 crc kubenswrapper[4722]: I1002 09:33:09.985502 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:09Z","lastTransitionTime":"2025-10-02T09:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.041619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:10 crc kubenswrapper[4722]: E1002 09:33:10.041807 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:10 crc kubenswrapper[4722]: E1002 09:33:10.041901 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs podName:3d43b03f-1f3f-4824-9eda-93e405f6ee9e nodeName:}" failed. No retries permitted until 2025-10-02 09:33:18.041878463 +0000 UTC m=+54.078631473 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs") pod "network-metrics-daemon-jn9jw" (UID: "3d43b03f-1f3f-4824-9eda-93e405f6ee9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.087294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.087377 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.087393 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.087409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.087422 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:10Z","lastTransitionTime":"2025-10-02T09:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.190220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.190270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.190281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.190299 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.190355 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:10Z","lastTransitionTime":"2025-10-02T09:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.293571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.293797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.293805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.293820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.293829 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:10Z","lastTransitionTime":"2025-10-02T09:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.396990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.397039 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.397053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.397071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.397082 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:10Z","lastTransitionTime":"2025-10-02T09:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.499831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.499885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.499898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.499916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.499930 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:10Z","lastTransitionTime":"2025-10-02T09:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.602146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.602193 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.602208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.602225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.602240 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:10Z","lastTransitionTime":"2025-10-02T09:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.705125 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.705166 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.705174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.705188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.705199 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:10Z","lastTransitionTime":"2025-10-02T09:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.709480 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.709525 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.709563 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:10 crc kubenswrapper[4722]: E1002 09:33:10.709602 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.709638 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:10 crc kubenswrapper[4722]: E1002 09:33:10.709673 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:10 crc kubenswrapper[4722]: E1002 09:33:10.709718 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:10 crc kubenswrapper[4722]: E1002 09:33:10.709757 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.807165 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.807211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.807222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.807240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.807251 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:10Z","lastTransitionTime":"2025-10-02T09:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.909513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.909570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.909580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.909598 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:10 crc kubenswrapper[4722]: I1002 09:33:10.909609 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:10Z","lastTransitionTime":"2025-10-02T09:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.012282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.012338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.012352 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.012369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.012381 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:11Z","lastTransitionTime":"2025-10-02T09:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.114960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.115005 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.115014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.115029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.115039 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:11Z","lastTransitionTime":"2025-10-02T09:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.217154 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.217214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.217231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.217254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.217265 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:11Z","lastTransitionTime":"2025-10-02T09:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.319562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.319604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.319615 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.319632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.319645 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:11Z","lastTransitionTime":"2025-10-02T09:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.422292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.422355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.422372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.422387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.422397 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:11Z","lastTransitionTime":"2025-10-02T09:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.524687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.524724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.524733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.524748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.524757 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:11Z","lastTransitionTime":"2025-10-02T09:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.628017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.628077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.628094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.628115 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.628126 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:11Z","lastTransitionTime":"2025-10-02T09:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.730520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.730557 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.730567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.730582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.730593 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:11Z","lastTransitionTime":"2025-10-02T09:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.833064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.833100 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.833112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.833126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.833139 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:11Z","lastTransitionTime":"2025-10-02T09:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.935761 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.935832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.935850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.935867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:11 crc kubenswrapper[4722]: I1002 09:33:11.935878 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:11Z","lastTransitionTime":"2025-10-02T09:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.039030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.039106 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.039117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.039133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.039142 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.143503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.143570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.143583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.143604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.143619 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.246468 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.246527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.246536 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.246554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.246564 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.349649 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.349695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.349705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.349721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.349734 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.453267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.453414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.453441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.453476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.453510 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.556264 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.556353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.556365 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.556381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.556394 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.658857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.658904 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.658913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.658935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.658946 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.709967 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.709976 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:12 crc kubenswrapper[4722]: E1002 09:33:12.710170 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.710005 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.709987 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:12 crc kubenswrapper[4722]: E1002 09:33:12.710336 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:12 crc kubenswrapper[4722]: E1002 09:33:12.710414 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:12 crc kubenswrapper[4722]: E1002 09:33:12.710508 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.761113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.761155 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.761166 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.761181 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.761191 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.864173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.864210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.864220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.864234 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.864243 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.937766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.937807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.937818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.937836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.937849 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: E1002 09:33:12.952227 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:12Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.957101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.957131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.957144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.957160 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.957172 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: E1002 09:33:12.968656 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:12Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.972996 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.973033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.973043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.973056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.973067 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:12 crc kubenswrapper[4722]: E1002 09:33:12.986026 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:12Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.989831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.989868 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.989877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.989892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:12 crc kubenswrapper[4722]: I1002 09:33:12.989903 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:12Z","lastTransitionTime":"2025-10-02T09:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: E1002 09:33:13.002202 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:12Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.005755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.005789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.005801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.005817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.005828 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: E1002 09:33:13.016829 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:13Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:13 crc kubenswrapper[4722]: E1002 09:33:13.016995 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.018699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.018737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.018749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.018765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.018776 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.121455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.121505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.121517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.121537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.121555 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.223297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.223345 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.223355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.223371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.223382 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.325797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.326051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.326185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.326276 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.326361 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.428215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.428536 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.428607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.428690 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.428764 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.531643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.531687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.531696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.531714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.531726 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.634254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.634286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.634294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.634309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.634346 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.736950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.736991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.737003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.737019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.737031 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.839413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.839452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.839461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.839475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.839484 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.942371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.942411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.942419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.942437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:13 crc kubenswrapper[4722]: I1002 09:33:13.942447 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:13Z","lastTransitionTime":"2025-10-02T09:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.045921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.045963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.045972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.045987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.045997 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:14Z","lastTransitionTime":"2025-10-02T09:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.148002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.148042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.148052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.148068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.148078 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:14Z","lastTransitionTime":"2025-10-02T09:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.250578 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.250626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.250640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.250655 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.250667 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:14Z","lastTransitionTime":"2025-10-02T09:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.353068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.353109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.353118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.353132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.353141 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:14Z","lastTransitionTime":"2025-10-02T09:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.455625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.455671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.455686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.455704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.455715 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:14Z","lastTransitionTime":"2025-10-02T09:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.558623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.558667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.558679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.558698 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.558711 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:14Z","lastTransitionTime":"2025-10-02T09:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.614530 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.615449 4722 scope.go:117] "RemoveContainer" containerID="a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.661075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.661113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.661126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.661143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.661153 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:14Z","lastTransitionTime":"2025-10-02T09:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.710175 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.710245 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.710286 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:14 crc kubenswrapper[4722]: E1002 09:33:14.710468 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.710919 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:14 crc kubenswrapper[4722]: E1002 09:33:14.711177 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:14 crc kubenswrapper[4722]: E1002 09:33:14.711559 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:14 crc kubenswrapper[4722]: E1002 09:33:14.711741 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.723835 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.741492 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.753407 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.763845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.763884 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.763896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.763914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.763926 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:14Z","lastTransitionTime":"2025-10-02T09:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.765207 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.779180 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.796628 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.811243 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.832175 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd2359
5cfd096e941280525aa683ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.844552 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.855188 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.866957 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.866991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.866999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.867012 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.867024 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:14Z","lastTransitionTime":"2025-10-02T09:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.867063 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.879695 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.891464 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.905618 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.915291 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.929014 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:14Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.969110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.969169 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.969182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.969199 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:14 crc kubenswrapper[4722]: I1002 09:33:14.969212 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:14Z","lastTransitionTime":"2025-10-02T09:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.055859 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/1.log" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.057519 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1"} Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.058582 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.070968 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.071243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.071401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.071508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.071638 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:15Z","lastTransitionTime":"2025-10-02T09:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.075437 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.093270 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.109827 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.126204 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.135982 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.148546 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.159159 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.171809 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.174470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.174510 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.174518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.174534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.174543 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:15Z","lastTransitionTime":"2025-10-02T09:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.184954 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.199644 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.212581 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.222440 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.235561 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.249251 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.267489 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e
7331302b9df8f5e4751727c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"
containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.276991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.277035 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.277046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.277063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.277075 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:15Z","lastTransitionTime":"2025-10-02T09:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.293023 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:15Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.378950 4722 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.379186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.379331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.379429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.379525 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:15Z","lastTransitionTime":"2025-10-02T09:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.482428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.482777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.482886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.482978 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.483046 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:15Z","lastTransitionTime":"2025-10-02T09:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.586121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.586172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.586186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.586202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.586216 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:15Z","lastTransitionTime":"2025-10-02T09:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.688509 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.688547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.688594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.688607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.688617 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:15Z","lastTransitionTime":"2025-10-02T09:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.790857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.790974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.790988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.791003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.791015 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:15Z","lastTransitionTime":"2025-10-02T09:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.894007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.894053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.894064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.894084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.894096 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:15Z","lastTransitionTime":"2025-10-02T09:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.996427 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.996487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.996498 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.996518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:15 crc kubenswrapper[4722]: I1002 09:33:15.996530 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:15Z","lastTransitionTime":"2025-10-02T09:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.062843 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/2.log"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.063932 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/1.log"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.066394 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1" exitCode=1
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.066436 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1"}
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.066495 4722 scope.go:117] "RemoveContainer" containerID="a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.067110 4722 scope.go:117] "RemoveContainer" containerID="d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1"
Oct 02 09:33:16 crc kubenswrapper[4722]: E1002 09:33:16.067303 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.084523 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.098587 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.098813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.098827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.098834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.098848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.098858 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:16Z","lastTransitionTime":"2025-10-02T09:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.108526 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.120383 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.130955 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.144116 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.154421 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.167907 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.184124 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.200144 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.201894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.201951 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.201963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.201990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.202003 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:16Z","lastTransitionTime":"2025-10-02T09:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.216238 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.228986 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.245606 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.259573 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.273301 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.296096 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e
7331302b9df8f5e4751727c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:15Z\\\",\\\"message\\\":\\\"uilt lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 09:33:15.499400 6373 services_controller.go:356] Processing sync for service openshift-apiserver-operator/metrics for network=default\\\\nI1002 09:33:15.498907 6373 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl in node crc\\\\nI1002 09:33:15.499439 6373 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl after 0 failed 
attempt(s)\\\\nI1002 09:33:15.499444 6373 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-jlvpl\\\\nF1002 09:33:15.498925 6373 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.304815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.304858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.304866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.304881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.304891 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:16Z","lastTransitionTime":"2025-10-02T09:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.407918 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.407965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.407985 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.408002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.408014 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:16Z","lastTransitionTime":"2025-10-02T09:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.460598 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.471470 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.476536 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.488900 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.504960 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.510237 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.510266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.510274 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.510287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.510297 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:16Z","lastTransitionTime":"2025-10-02T09:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.516692 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2
c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.536024 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e
7331302b9df8f5e4751727c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3b8bb2b4eff73f696db5caf412dec0557cd23595cfd096e941280525aa683ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:00Z\\\",\\\"message\\\":\\\"ver/kube-apiserver-crc\\\\nI1002 09:33:00.844178 6127 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nF1002 09:33:00.844180 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:00Z is after 2025-08-24T17:21:41Z]\\\\nI1002 09:33:00.844173 6127 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:15Z\\\",\\\"message\\\":\\\"uilt lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 09:33:15.499400 6373 services_controller.go:356] Processing sync for service openshift-apiserver-operator/metrics for network=default\\\\nI1002 09:33:15.498907 6373 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl in node crc\\\\nI1002 09:33:15.499439 6373 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl after 0 failed 
attempt(s)\\\\nI1002 09:33:15.499444 6373 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-jlvpl\\\\nF1002 09:33:15.498925 6373 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.552501 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.567555 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.582781 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.596731 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.612033 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.613418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.613456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.613469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.613490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.613502 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:16Z","lastTransitionTime":"2025-10-02T09:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.624769 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.634007 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.645447 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.659085 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.669504 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.682429 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:16Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.709705 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.709770 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.709826 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:16 crc kubenswrapper[4722]: E1002 09:33:16.709851 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:16 crc kubenswrapper[4722]: E1002 09:33:16.710045 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.710144 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:16 crc kubenswrapper[4722]: E1002 09:33:16.710390 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:16 crc kubenswrapper[4722]: E1002 09:33:16.710658 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.716229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.716277 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.716295 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.716332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.716346 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:16Z","lastTransitionTime":"2025-10-02T09:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.819806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.819837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.819845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.819858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.819868 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:16Z","lastTransitionTime":"2025-10-02T09:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.923259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.923304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.923330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.923388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:16 crc kubenswrapper[4722]: I1002 09:33:16.923402 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:16Z","lastTransitionTime":"2025-10-02T09:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.025330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.025368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.025378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.025391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.025401 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:17Z","lastTransitionTime":"2025-10-02T09:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.072462 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/2.log" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.076032 4722 scope.go:117] "RemoveContainer" containerID="d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1" Oct 02 09:33:17 crc kubenswrapper[4722]: E1002 09:33:17.076199 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.090175 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.103774 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.122111 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e
7331302b9df8f5e4751727c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:15Z\\\",\\\"message\\\":\\\"uilt lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 09:33:15.499400 6373 services_controller.go:356] Processing sync for service openshift-apiserver-operator/metrics for network=default\\\\nI1002 09:33:15.498907 6373 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl in node crc\\\\nI1002 09:33:15.499439 6373 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl after 0 failed attempt(s)\\\\nI1002 09:33:15.499444 6373 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-jlvpl\\\\nF1002 09:33:15.498925 6373 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.127124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.127165 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.127174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.127189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.127203 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:17Z","lastTransitionTime":"2025-10-02T09:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.138021 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.152467 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.164682 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.177547 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.188918 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.201002 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.212503 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.226128 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.229858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.229902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.229917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.229932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.229945 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:17Z","lastTransitionTime":"2025-10-02T09:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.235254 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.246403 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.257273 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.268661 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.278460 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.287874 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:17Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.331806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.332050 4722 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.332201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.332307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.332468 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:17Z","lastTransitionTime":"2025-10-02T09:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.435029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.435414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.435523 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.435637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.435725 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:17Z","lastTransitionTime":"2025-10-02T09:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.538516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.538546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.538555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.538568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.538577 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:17Z","lastTransitionTime":"2025-10-02T09:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.641236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.641275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.641283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.641295 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.641304 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:17Z","lastTransitionTime":"2025-10-02T09:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.744648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.744687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.744696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.744712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.744722 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:17Z","lastTransitionTime":"2025-10-02T09:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.846694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.846758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.846769 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.846790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.846799 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:17Z","lastTransitionTime":"2025-10-02T09:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.949755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.949812 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.949823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.949848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:17 crc kubenswrapper[4722]: I1002 09:33:17.949861 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:17Z","lastTransitionTime":"2025-10-02T09:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.053164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.053220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.053230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.053286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.053308 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:18Z","lastTransitionTime":"2025-10-02T09:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.126879 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.127094 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.127221 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs podName:3d43b03f-1f3f-4824-9eda-93e405f6ee9e nodeName:}" failed. No retries permitted until 2025-10-02 09:33:34.127190087 +0000 UTC m=+70.163943137 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs") pod "network-metrics-daemon-jn9jw" (UID: "3d43b03f-1f3f-4824-9eda-93e405f6ee9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.156851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.156903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.156916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.156934 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.156946 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:18Z","lastTransitionTime":"2025-10-02T09:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.260432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.260503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.260532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.260565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.260595 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:18Z","lastTransitionTime":"2025-10-02T09:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.363635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.363675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.363684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.363705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.363715 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:18Z","lastTransitionTime":"2025-10-02T09:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.430991 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.431492 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:33:50.431462635 +0000 UTC m=+86.468215655 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.466245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.466342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.466362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.466392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.466411 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:18Z","lastTransitionTime":"2025-10-02T09:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.532115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.532174 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.532192 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.532213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532388 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532408 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532420 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532464 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532555 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532480 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:50.532465199 +0000 UTC m=+86.569218179 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532614 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532646 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532620 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532704 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:50.532635924 +0000 UTC m=+86.569388924 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532742 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:50.532725206 +0000 UTC m=+86.569478426 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.532768 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:33:50.532755627 +0000 UTC m=+86.569508847 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.569729 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.569882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.569906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.569944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.569964 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:18Z","lastTransitionTime":"2025-10-02T09:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.673176 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.673231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.673255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.673275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.673286 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:18Z","lastTransitionTime":"2025-10-02T09:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.709908 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.709988 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.709934 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.709926 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.710151 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.710298 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.710464 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:18 crc kubenswrapper[4722]: E1002 09:33:18.710616 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.776707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.776742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.776751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.776772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.776783 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:18Z","lastTransitionTime":"2025-10-02T09:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.880145 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.880296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.880400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.880486 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.880513 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:18Z","lastTransitionTime":"2025-10-02T09:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.984338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.984383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.984395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.984412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:18 crc kubenswrapper[4722]: I1002 09:33:18.984424 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:18Z","lastTransitionTime":"2025-10-02T09:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.087391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.087456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.087470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.087492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.087509 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:19Z","lastTransitionTime":"2025-10-02T09:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.190812 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.190868 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.190881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.190903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.190918 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:19Z","lastTransitionTime":"2025-10-02T09:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.294639 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.294721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.294754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.294786 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.294808 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:19Z","lastTransitionTime":"2025-10-02T09:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.397444 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.397491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.397503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.397523 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.397536 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:19Z","lastTransitionTime":"2025-10-02T09:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.500465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.500508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.500518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.500540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.500551 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:19Z","lastTransitionTime":"2025-10-02T09:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.602745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.603182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.603201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.603232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.603252 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:19Z","lastTransitionTime":"2025-10-02T09:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.705800 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.705856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.705865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.705885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.705896 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:19Z","lastTransitionTime":"2025-10-02T09:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.809101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.809184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.809207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.809239 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.809262 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:19Z","lastTransitionTime":"2025-10-02T09:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.912463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.912552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.912576 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.912612 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:19 crc kubenswrapper[4722]: I1002 09:33:19.912636 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:19Z","lastTransitionTime":"2025-10-02T09:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.015600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.015638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.015648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.015663 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.015676 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:20Z","lastTransitionTime":"2025-10-02T09:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.117977 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.118014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.118024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.118037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.118047 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:20Z","lastTransitionTime":"2025-10-02T09:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.219970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.220005 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.220015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.220029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.220038 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:20Z","lastTransitionTime":"2025-10-02T09:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.323426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.323539 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.323555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.323585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.323598 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:20Z","lastTransitionTime":"2025-10-02T09:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.426017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.426087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.426096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.426112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.426122 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:20Z","lastTransitionTime":"2025-10-02T09:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.528793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.528842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.528851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.528866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.528880 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:20Z","lastTransitionTime":"2025-10-02T09:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.632607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.632687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.632706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.632738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.632758 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:20Z","lastTransitionTime":"2025-10-02T09:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.710107 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.710177 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.710213 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.710107 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:20 crc kubenswrapper[4722]: E1002 09:33:20.710359 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 09:33:20 crc kubenswrapper[4722]: E1002 09:33:20.710464 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e"
Oct 02 09:33:20 crc kubenswrapper[4722]: E1002 09:33:20.710563 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 09:33:20 crc kubenswrapper[4722]: E1002 09:33:20.710721 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.735243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.735279 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.735289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.735304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.735330 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:20Z","lastTransitionTime":"2025-10-02T09:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
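With NetworkReady=false, the kubelet refuses to create sandboxes for the four pods that need pod networking (network-check-target, network-metrics-daemon, networking-console-plugin, network-check-source) and simply re-queues them, which is why the same "Error syncing pod, skipping" quartet recurs on the next retry at 09:33:22 below. A small illustrative filter for pulling the affected pod/podUID pairs out of a journal stream; this is a hypothetical helper, not part of any shipped tooling:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the pod= and podUID= fields of the kubelet's sync-error records.
var syncErr = regexp.MustCompile(`Error syncing pod, skipping".*pod="([^"]+)" podUID="([^"]+)"`)

func main() {
	seen := map[string]string{} // pod name -> UID, deduplicated across retries
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		if m := syncErr.FindStringSubmatch(sc.Text()); m != nil {
			seen[m[1]] = m[2]
		}
	}
	for pod, uid := range seen {
		fmt.Printf("%s %s\n", pod, uid)
	}
}

Fed something like journalctl -u kubelet --no-pager on stdin, it prints each stuck pod once.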
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.839421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.839462 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.839471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.839487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.839500 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:20Z","lastTransitionTime":"2025-10-02T09:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.942528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.942602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.942626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.942663 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:20 crc kubenswrapper[4722]: I1002 09:33:20.942687 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:20Z","lastTransitionTime":"2025-10-02T09:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.046887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.046971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.046991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.047033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.047057 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:21Z","lastTransitionTime":"2025-10-02T09:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.149380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.149422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.149435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.149455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.149501 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:21Z","lastTransitionTime":"2025-10-02T09:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.251896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.251961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.251971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.251987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.252000 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:21Z","lastTransitionTime":"2025-10-02T09:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.354638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.354680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.354688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.354706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.354718 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:21Z","lastTransitionTime":"2025-10-02T09:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.457493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.457546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.457556 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.457575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.457585 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:21Z","lastTransitionTime":"2025-10-02T09:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.560101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.560178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.560191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.560231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.560241 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:21Z","lastTransitionTime":"2025-10-02T09:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.663507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.663600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.663621 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.663660 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.663686 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:21Z","lastTransitionTime":"2025-10-02T09:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.766000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.766043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.766054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.766069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.766080 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:21Z","lastTransitionTime":"2025-10-02T09:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.869010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.869056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.869068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.869083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.869095 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:21Z","lastTransitionTime":"2025-10-02T09:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.971298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.971385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.971396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.971413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:21 crc kubenswrapper[4722]: I1002 09:33:21.971427 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:21Z","lastTransitionTime":"2025-10-02T09:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.074632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.074684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.074693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.074707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.074716 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:22Z","lastTransitionTime":"2025-10-02T09:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.177185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.177240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.177257 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.177282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.177294 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:22Z","lastTransitionTime":"2025-10-02T09:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.280148 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.280232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.280260 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.280291 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.280310 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:22Z","lastTransitionTime":"2025-10-02T09:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.383251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.383304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.383337 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.383353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.383370 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:22Z","lastTransitionTime":"2025-10-02T09:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.486329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.486376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.486386 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.486400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.486409 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:22Z","lastTransitionTime":"2025-10-02T09:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.588938 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.589017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.589026 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.589042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.589051 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:22Z","lastTransitionTime":"2025-10-02T09:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.691752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.691788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.691796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.691810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.691819 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:22Z","lastTransitionTime":"2025-10-02T09:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.709279 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.709338 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.709366 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:22 crc kubenswrapper[4722]: E1002 09:33:22.709474 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.709492 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:22 crc kubenswrapper[4722]: E1002 09:33:22.709575 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:22 crc kubenswrapper[4722]: E1002 09:33:22.709701 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:22 crc kubenswrapper[4722]: E1002 09:33:22.709759 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.793805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.793845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.793854 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.793870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.793882 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:22Z","lastTransitionTime":"2025-10-02T09:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.896605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.896637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.896646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.896662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.896677 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:22Z","lastTransitionTime":"2025-10-02T09:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.999926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.999973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:22.999983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:22 crc kubenswrapper[4722]: I1002 09:33:23.000001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.000012 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:22Z","lastTransitionTime":"2025-10-02T09:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.056726 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.071360 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-op
erator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z"
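The patch above is rejected not by the API server itself but by the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, more than a month before the node's clock reading of 2025-10-02T09:33:23Z; this is consistent with a CRC image whose internally rotated certificates lapsed while the VM sat unused. The failing TLS verification reduces to a NotAfter comparison; a minimal sketch with Go's crypto/x509, assuming a hypothetical PEM copy of the webhook certificate on disk:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical path; the real certificate lives inside the cluster.
	data, err := os.ReadFile("network-node-identity-serving.pem")
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	if now := time.Now().UTC(); now.After(cert.NotAfter) {
		// Same shape as the log's x509 error: current time ... is after NotAfter.
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	} else {
		fmt.Printf("certificate valid until %s\n", cert.NotAfter.Format(time.RFC3339))
	}
}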
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.094450 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.101809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.101874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.101886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.101901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.101910 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.107780 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.120386 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.131491 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.141938 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.153200 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.164760 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.182153 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e
7331302b9df8f5e4751727c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:15Z\\\",\\\"message\\\":\\\"uilt lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 09:33:15.499400 6373 services_controller.go:356] Processing sync for service openshift-apiserver-operator/metrics for network=default\\\\nI1002 09:33:15.498907 6373 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl in node crc\\\\nI1002 09:33:15.499439 6373 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl after 0 failed attempt(s)\\\\nI1002 09:33:15.499444 6373 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-jlvpl\\\\nF1002 09:33:15.498925 6373 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.197303 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.203766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.203817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.203827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.203844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.203854 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.208075 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.218385 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.230654 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.243563 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.255561 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.265957 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.281607 4722 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.281653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.281664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.281682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.281692 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: E1002 09:33:23.293249 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.297178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.297228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.297242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.297261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.297276 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: E1002 09:33:23.308791 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.313827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.313888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.313905 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.313921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.313931 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: E1002 09:33:23.326386 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.331808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.331882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.331903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.331935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.331957 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: E1002 09:33:23.348627 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.352569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.352612 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.352624 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.352643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.352657 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: E1002 09:33:23.365022 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:23Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:23 crc kubenswrapper[4722]: E1002 09:33:23.365178 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.367418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.367459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.367468 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.367484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.367495 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.469814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.469870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.469887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.469939 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.469953 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.572092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.572132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.572141 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.572159 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.572171 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.674240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.674281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.674289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.674301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.674310 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.777406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.777447 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.777456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.777473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.777483 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.880113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.880164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.880176 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.880194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.880207 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.982363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.982406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.982417 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.982433 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:23 crc kubenswrapper[4722]: I1002 09:33:23.982445 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:23Z","lastTransitionTime":"2025-10-02T09:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.085903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.085949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.085957 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.085972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.085982 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:24Z","lastTransitionTime":"2025-10-02T09:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.188207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.188253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.188265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.188283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.188297 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:24Z","lastTransitionTime":"2025-10-02T09:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.290073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.290112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.290123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.290139 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.290151 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:24Z","lastTransitionTime":"2025-10-02T09:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.392339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.392382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.392390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.392404 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.392413 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:24Z","lastTransitionTime":"2025-10-02T09:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.495369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.495407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.495418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.495432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.495443 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:24Z","lastTransitionTime":"2025-10-02T09:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.598070 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.598116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.598128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.598146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.598157 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:24Z","lastTransitionTime":"2025-10-02T09:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.700760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.700814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.700826 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.700844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.700857 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:24Z","lastTransitionTime":"2025-10-02T09:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.709340 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.709356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.709356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.709364 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:24 crc kubenswrapper[4722]: E1002 09:33:24.709500 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:24 crc kubenswrapper[4722]: E1002 09:33:24.709563 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:24 crc kubenswrapper[4722]: E1002 09:33:24.709683 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:24 crc kubenswrapper[4722]: E1002 09:33:24.709783 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.722175 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.734824 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.744232 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.753150 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.763768 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.774293 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.787156 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.802283 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.803187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.803216 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.803226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.803241 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.803289 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:24Z","lastTransitionTime":"2025-10-02T09:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.818031 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:15Z\\\",\\\"message\\\":\\\"uilt lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 09:33:15.499400 6373 services_controller.go:356] Processing sync for service openshift-apiserver-operator/metrics for network=default\\\\nI1002 09:33:15.498907 6373 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl in node crc\\\\nI1002 09:33:15.499439 6373 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl after 0 failed attempt(s)\\\\nI1002 09:33:15.499444 6373 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-jlvpl\\\\nF1002 09:33:15.498925 6373 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.831676 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.842180 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.854752 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.864624 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.876573 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.885879 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.894761 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.905241 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:24Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.905532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.905587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.905601 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.905619 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:24 crc kubenswrapper[4722]: I1002 09:33:24.905632 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:24Z","lastTransitionTime":"2025-10-02T09:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.007937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.008040 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.008054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.008078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.008090 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:25Z","lastTransitionTime":"2025-10-02T09:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.112140 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.112200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.112213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.112236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.112249 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:25Z","lastTransitionTime":"2025-10-02T09:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.215780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.215863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.215888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.215923 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.215943 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:25Z","lastTransitionTime":"2025-10-02T09:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.318126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.318181 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.318192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.318213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.318224 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:25Z","lastTransitionTime":"2025-10-02T09:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.420563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.420606 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.420617 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.420632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.420641 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:25Z","lastTransitionTime":"2025-10-02T09:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.524680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.524723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.524732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.524748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.524761 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:25Z","lastTransitionTime":"2025-10-02T09:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.631343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.631387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.631395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.631409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.631418 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:25Z","lastTransitionTime":"2025-10-02T09:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.733121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.733153 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.733161 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.733173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.733184 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:25Z","lastTransitionTime":"2025-10-02T09:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.836191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.836233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.836242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.836258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.836271 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:25Z","lastTransitionTime":"2025-10-02T09:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.938574 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.938676 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.938688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.938706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:25 crc kubenswrapper[4722]: I1002 09:33:25.938720 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:25Z","lastTransitionTime":"2025-10-02T09:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.041670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.041742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.041755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.041779 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.041796 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:26Z","lastTransitionTime":"2025-10-02T09:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.146799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.146887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.146903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.146923 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.146934 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:26Z","lastTransitionTime":"2025-10-02T09:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.249506 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.249550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.249565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.249582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.249592 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:26Z","lastTransitionTime":"2025-10-02T09:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.351879 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.351920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.351932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.351950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.351962 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:26Z","lastTransitionTime":"2025-10-02T09:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.454075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.454110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.454117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.454131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.454139 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:26Z","lastTransitionTime":"2025-10-02T09:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.556661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.556717 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.556726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.556739 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.556747 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:26Z","lastTransitionTime":"2025-10-02T09:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.659178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.659219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.659232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.659252 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.659264 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:26Z","lastTransitionTime":"2025-10-02T09:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.710134 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.710176 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.710134 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:26 crc kubenswrapper[4722]: E1002 09:33:26.710284 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.710456 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:26 crc kubenswrapper[4722]: E1002 09:33:26.710546 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:26 crc kubenswrapper[4722]: E1002 09:33:26.710586 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:26 crc kubenswrapper[4722]: E1002 09:33:26.710756 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.761247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.761295 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.761346 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.761364 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.761375 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:26Z","lastTransitionTime":"2025-10-02T09:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.863750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.863790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.863800 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.863816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.863826 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:26Z","lastTransitionTime":"2025-10-02T09:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.967239 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.967279 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.967287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.967300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:26 crc kubenswrapper[4722]: I1002 09:33:26.967308 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:26Z","lastTransitionTime":"2025-10-02T09:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.069533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.069577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.069587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.069605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.069619 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:27Z","lastTransitionTime":"2025-10-02T09:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.172591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.172630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.172645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.172663 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.172676 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:27Z","lastTransitionTime":"2025-10-02T09:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.274516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.274576 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.274593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.274609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.274622 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:27Z","lastTransitionTime":"2025-10-02T09:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.377036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.377273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.377283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.377300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.377325 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:27Z","lastTransitionTime":"2025-10-02T09:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.480140 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.480182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.480190 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.480205 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.480213 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:27Z","lastTransitionTime":"2025-10-02T09:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.582697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.582736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.582744 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.582759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.582769 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:27Z","lastTransitionTime":"2025-10-02T09:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.685187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.685215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.685223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.685236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.685245 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:27Z","lastTransitionTime":"2025-10-02T09:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.789143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.789285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.789301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.789331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.789341 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:27Z","lastTransitionTime":"2025-10-02T09:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.892497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.892540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.892549 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.892563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.892574 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:27Z","lastTransitionTime":"2025-10-02T09:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.995336 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.995394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.995406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.995428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:27 crc kubenswrapper[4722]: I1002 09:33:27.995440 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:27Z","lastTransitionTime":"2025-10-02T09:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.098417 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.098490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.098512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.098545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.098572 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:28Z","lastTransitionTime":"2025-10-02T09:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.201825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.201876 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.201888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.201904 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.201913 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:28Z","lastTransitionTime":"2025-10-02T09:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.305674 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.305713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.305722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.305737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.305748 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:28Z","lastTransitionTime":"2025-10-02T09:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.409269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.409715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.409726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.409743 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.409754 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:28Z","lastTransitionTime":"2025-10-02T09:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.513140 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.513182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.513197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.513216 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.513226 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:28Z","lastTransitionTime":"2025-10-02T09:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.615356 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.615387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.615395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.615409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.615419 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:28Z","lastTransitionTime":"2025-10-02T09:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.710064 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.710127 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.710135 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:28 crc kubenswrapper[4722]: E1002 09:33:28.710188 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.710252 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:28 crc kubenswrapper[4722]: E1002 09:33:28.710309 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:28 crc kubenswrapper[4722]: E1002 09:33:28.710471 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:28 crc kubenswrapper[4722]: E1002 09:33:28.710563 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.717984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.718013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.718024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.718039 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.718048 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:28Z","lastTransitionTime":"2025-10-02T09:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.821239 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.821269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.821278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.821291 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.821301 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:28Z","lastTransitionTime":"2025-10-02T09:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.923545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.923589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.923598 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.923613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:28 crc kubenswrapper[4722]: I1002 09:33:28.923626 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:28Z","lastTransitionTime":"2025-10-02T09:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.026442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.026784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.026861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.026941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.027066 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:29Z","lastTransitionTime":"2025-10-02T09:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.129406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.129976 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.130059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.130145 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.130234 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:29Z","lastTransitionTime":"2025-10-02T09:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.233176 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.233504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.233572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.233640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.233705 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:29Z","lastTransitionTime":"2025-10-02T09:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.336232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.336922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.337178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.337280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.337401 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:29Z","lastTransitionTime":"2025-10-02T09:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.439953 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.440014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.440031 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.440057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.440072 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:29Z","lastTransitionTime":"2025-10-02T09:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.543030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.543076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.543090 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.543109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.543122 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:29Z","lastTransitionTime":"2025-10-02T09:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.645835 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.645885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.645896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.645915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.645929 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:29Z","lastTransitionTime":"2025-10-02T09:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.749992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.750033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.750044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.750091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.750105 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:29Z","lastTransitionTime":"2025-10-02T09:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.852859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.852910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.852921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.852942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.852954 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:29Z","lastTransitionTime":"2025-10-02T09:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.956530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.956936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.957042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.957134 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:29 crc kubenswrapper[4722]: I1002 09:33:29.957225 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:29Z","lastTransitionTime":"2025-10-02T09:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.060376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.060426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.060436 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.060452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.060465 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:30Z","lastTransitionTime":"2025-10-02T09:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.162711 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.162997 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.163079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.163205 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.163301 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:30Z","lastTransitionTime":"2025-10-02T09:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.267196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.267253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.267273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.267290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.267301 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:30Z","lastTransitionTime":"2025-10-02T09:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.369576 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.370398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.370482 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.370563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.370638 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:30Z","lastTransitionTime":"2025-10-02T09:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.473441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.473481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.473492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.473508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.473519 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:30Z","lastTransitionTime":"2025-10-02T09:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.576101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.576435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.576522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.576591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.576658 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:30Z","lastTransitionTime":"2025-10-02T09:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.679561 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.679618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.679648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.679665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.679676 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:30Z","lastTransitionTime":"2025-10-02T09:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.710412 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.710667 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.710874 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:30 crc kubenswrapper[4722]: E1002 09:33:30.711199 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:30 crc kubenswrapper[4722]: E1002 09:33:30.711300 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:30 crc kubenswrapper[4722]: E1002 09:33:30.711411 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.711697 4722 scope.go:117] "RemoveContainer" containerID="d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1" Oct 02 09:33:30 crc kubenswrapper[4722]: E1002 09:33:30.711967 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.712271 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:30 crc kubenswrapper[4722]: E1002 09:33:30.714114 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.723466 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.781846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.782166 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.782254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.782398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.782509 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:30Z","lastTransitionTime":"2025-10-02T09:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.885119 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.885178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.885191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.885219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.885233 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:30Z","lastTransitionTime":"2025-10-02T09:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.988369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.988411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.988420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.988436 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:30 crc kubenswrapper[4722]: I1002 09:33:30.988447 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:30Z","lastTransitionTime":"2025-10-02T09:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.090962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.091009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.091020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.091044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.091057 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:31Z","lastTransitionTime":"2025-10-02T09:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.194648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.194723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.194741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.194770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.194790 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:31Z","lastTransitionTime":"2025-10-02T09:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.298141 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.298195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.298205 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.298221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.298233 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:31Z","lastTransitionTime":"2025-10-02T09:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.400692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.400734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.400742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.400759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.400769 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:31Z","lastTransitionTime":"2025-10-02T09:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.503470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.503826 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.503950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.504045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.504137 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:31Z","lastTransitionTime":"2025-10-02T09:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.606888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.606950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.606962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.606978 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.606987 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:31Z","lastTransitionTime":"2025-10-02T09:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.710149 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.710430 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.710591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.710877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.710959 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:31Z","lastTransitionTime":"2025-10-02T09:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.813150 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.813505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.813640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.813745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.813833 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:31Z","lastTransitionTime":"2025-10-02T09:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.916134 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.916205 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.916219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.916248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:31 crc kubenswrapper[4722]: I1002 09:33:31.916264 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:31Z","lastTransitionTime":"2025-10-02T09:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.023339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.023451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.023469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.023633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.024014 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:32Z","lastTransitionTime":"2025-10-02T09:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.126408 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.126453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.126465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.126485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.126498 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:32Z","lastTransitionTime":"2025-10-02T09:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.229189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.229234 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.229247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.229264 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.229274 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:32Z","lastTransitionTime":"2025-10-02T09:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.331455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.331493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.331517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.331534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.331542 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:32Z","lastTransitionTime":"2025-10-02T09:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.434252 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.434293 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.434306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.434342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.434352 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:32Z","lastTransitionTime":"2025-10-02T09:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.536727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.536784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.536799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.536818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.536830 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:32Z","lastTransitionTime":"2025-10-02T09:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.639216 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.639303 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.639343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.639374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.639389 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:32Z","lastTransitionTime":"2025-10-02T09:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.710128 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.710196 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.710153 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.710362 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:32 crc kubenswrapper[4722]: E1002 09:33:32.710573 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:32 crc kubenswrapper[4722]: E1002 09:33:32.710886 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:32 crc kubenswrapper[4722]: E1002 09:33:32.711056 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:32 crc kubenswrapper[4722]: E1002 09:33:32.711186 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.742375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.742420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.742431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.742450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.742462 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:32Z","lastTransitionTime":"2025-10-02T09:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.844816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.844880 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.844891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.844961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.844978 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:32Z","lastTransitionTime":"2025-10-02T09:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.948682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.948745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.948756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.948783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:32 crc kubenswrapper[4722]: I1002 09:33:32.948805 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:32Z","lastTransitionTime":"2025-10-02T09:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.052077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.052121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.052138 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.052157 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.052172 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.155039 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.155073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.155082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.155094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.155105 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.257457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.257496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.257505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.257519 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.257530 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.360563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.360612 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.360628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.360648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.360662 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.463640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.463722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.463735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.463748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.463758 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.475204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.475285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.475299 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.475338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.475401 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: E1002 09:33:33.491977 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:33Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.496156 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.496200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.496209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.496224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.496234 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: E1002 09:33:33.539028 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:33Z is after 
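Every one of these retries fails for the same reason recorded in the entries above: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24T17:21:41Z, so the kubelet's TLS client rejects the handshake before the status patch is ever delivered. A minimal sketch that reproduces the same x509 validity failure (Python 3.7+; host and port are taken from the logged webhook URL, and reachability of that endpoint from wherever this runs is an assumption):

import socket
import ssl

HOST, PORT = "127.0.0.1", 9743  # from the logged webhook URL

# A default context verifies the chain and validity dates, as the kubelet's client does.
# Assumption: the webhook cert chains to an internal cluster CA not shown in this log,
# so even a fresh cert would fail here with "unable to get local issuer certificate"
# unless that CA bundle is first loaded via ctx.load_verify_locations(...).
ctx = ssl.create_default_context()

try:
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            print("handshake OK, peer notAfter:", tls.getpeercert().get("notAfter"))
except ssl.SSLCertVerificationError as err:
    # Given the log above, expect verify_message "certificate has expired".
    print(f"verification failed: {err.verify_message} (verify_code {err.verify_code})")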
2025-08-24T17:21:41Z" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.546770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.546824 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.546836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.546849 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.546860 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: E1002 09:33:33.572512 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:33Z is after 
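Alongside the webhook failure, every NodeNotReady condition above carries the same root message: no CNI configuration file in /etc/kubernetes/cni/net.d/. A quick sketch to confirm that directory's state on the node (the path is taken verbatim from the log; running as a user that can read it is an assumption):

from pathlib import Path

# Path quoted in the kubelet's NetworkReady=false condition above.
cni_dir = Path("/etc/kubernetes/cni/net.d")

# List whatever config files are present; an empty result matches the logged error.
confs = sorted(p.name for p in cni_dir.iterdir()) if cni_dir.is_dir() else []
print(f"{cni_dir}: {confs if confs else 'no CNI configuration files (matches the kubelet error)'}")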
2025-08-24T17:21:41Z" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.577160 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.577237 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.577252 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.577284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.577300 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: E1002 09:33:33.594575 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:33Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.608156 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.608202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.608214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.608231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.608242 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: E1002 09:33:33.627595 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:33Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:33 crc kubenswrapper[4722]: E1002 09:33:33.627756 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.629591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.629626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.629637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.629652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.629663 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.731972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.732022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.732031 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.732050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.732063 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.835274 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.835338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.835353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.835394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.835406 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.938188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.938233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.938283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.938304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:33 crc kubenswrapper[4722]: I1002 09:33:33.938333 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:33Z","lastTransitionTime":"2025-10-02T09:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.041648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.041688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.041697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.041711 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.041721 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:34Z","lastTransitionTime":"2025-10-02T09:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.145603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.145654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.145668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.145690 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.145710 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:34Z","lastTransitionTime":"2025-10-02T09:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.210235 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:34 crc kubenswrapper[4722]: E1002 09:33:34.210411 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:34 crc kubenswrapper[4722]: E1002 09:33:34.210482 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs podName:3d43b03f-1f3f-4824-9eda-93e405f6ee9e nodeName:}" failed. No retries permitted until 2025-10-02 09:34:06.210462543 +0000 UTC m=+102.247215523 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs") pod "network-metrics-daemon-jn9jw" (UID: "3d43b03f-1f3f-4824-9eda-93e405f6ee9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.249129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.249203 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.249218 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.249240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.249255 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:34Z","lastTransitionTime":"2025-10-02T09:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.351979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.352031 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.352045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.352065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.352080 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:34Z","lastTransitionTime":"2025-10-02T09:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.455106 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.455186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.455198 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.455219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.455232 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:34Z","lastTransitionTime":"2025-10-02T09:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.558167 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.558240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.558263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.558301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.558388 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:34Z","lastTransitionTime":"2025-10-02T09:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.661776 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.661845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.661865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.661898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.661922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:34Z","lastTransitionTime":"2025-10-02T09:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.709800 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.709856 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.709879 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:34 crc kubenswrapper[4722]: E1002 09:33:34.710023 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.710058 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:34 crc kubenswrapper[4722]: E1002 09:33:34.710198 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:34 crc kubenswrapper[4722]: E1002 09:33:34.710447 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:34 crc kubenswrapper[4722]: E1002 09:33:34.710551 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.723278 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.744414 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.758549 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.765455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.765518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.765532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.765561 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.765580 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:34Z","lastTransitionTime":"2025-10-02T09:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.773072 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.791558 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.809050 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.823353 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.847527 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e
7331302b9df8f5e4751727c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:15Z\\\",\\\"message\\\":\\\"uilt lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 09:33:15.499400 6373 services_controller.go:356] Processing sync for service openshift-apiserver-operator/metrics for network=default\\\\nI1002 09:33:15.498907 6373 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl in node crc\\\\nI1002 09:33:15.499439 6373 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl after 0 failed attempt(s)\\\\nI1002 09:33:15.499444 6373 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-jlvpl\\\\nF1002 09:33:15.498925 6373 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.862590 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.868938 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.868972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.868980 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.868993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.869011 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:34Z","lastTransitionTime":"2025-10-02T09:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.874690 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.885530 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.899369 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.913340 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dec5204-e2c5-43eb-8b80-80b74c9aa806\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beaa5e380cbd961e9e06c7aad0f3ff2f42c43eab5e89b6cf779003c490baf018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.931933 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T
09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.949001 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.965301 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.976127 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.976175 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.976188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.976209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.976224 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:34Z","lastTransitionTime":"2025-10-02T09:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.980179 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:34 crc kubenswrapper[4722]: I1002 09:33:34.996031 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:34Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.079361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.079419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.079432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.079452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.079465 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:35Z","lastTransitionTime":"2025-10-02T09:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.182131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.182185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.182195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.182219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.182232 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:35Z","lastTransitionTime":"2025-10-02T09:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.284700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.284734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.284742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.284756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.284764 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:35Z","lastTransitionTime":"2025-10-02T09:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.388221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.388275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.388287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.388331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.388350 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:35Z","lastTransitionTime":"2025-10-02T09:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.491437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.491474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.491482 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.491497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.491507 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:35Z","lastTransitionTime":"2025-10-02T09:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.594708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.594748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.594759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.594777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.594789 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:35Z","lastTransitionTime":"2025-10-02T09:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.697475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.697532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.697546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.697562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.697574 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:35Z","lastTransitionTime":"2025-10-02T09:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.800286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.800343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.800352 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.800368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.800378 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:35Z","lastTransitionTime":"2025-10-02T09:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.903761 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.903813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.903824 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.903846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:35 crc kubenswrapper[4722]: I1002 09:33:35.903858 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:35Z","lastTransitionTime":"2025-10-02T09:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.008063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.008117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.008126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.008148 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.008162 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:36Z","lastTransitionTime":"2025-10-02T09:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.111438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.111495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.111510 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.111530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.111541 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:36Z","lastTransitionTime":"2025-10-02T09:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.215699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.215747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.215759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.215781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.215795 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:36Z","lastTransitionTime":"2025-10-02T09:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.318501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.318530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.318538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.318551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.318559 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:36Z","lastTransitionTime":"2025-10-02T09:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.421433 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.421514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.421524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.421543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.421554 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:36Z","lastTransitionTime":"2025-10-02T09:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.524355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.524418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.524430 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.524446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.524458 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:36Z","lastTransitionTime":"2025-10-02T09:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.627220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.627293 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.627306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.627376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.627387 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:36Z","lastTransitionTime":"2025-10-02T09:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.709805 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.709844 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.710007 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:36 crc kubenswrapper[4722]: E1002 09:33:36.710073 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:36 crc kubenswrapper[4722]: E1002 09:33:36.710560 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.710606 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:36 crc kubenswrapper[4722]: E1002 09:33:36.710768 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:36 crc kubenswrapper[4722]: E1002 09:33:36.710849 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.731213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.731275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.731288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.731349 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.731366 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:36Z","lastTransitionTime":"2025-10-02T09:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.833804 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.833901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.833919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.833942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.833957 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:36Z","lastTransitionTime":"2025-10-02T09:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.936668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.936792 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.936804 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.936821 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:36 crc kubenswrapper[4722]: I1002 09:33:36.936829 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:36Z","lastTransitionTime":"2025-10-02T09:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.039971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.040024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.040034 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.040053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.040066 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:37Z","lastTransitionTime":"2025-10-02T09:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.143778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.144095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.144109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.144126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.144136 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:37Z","lastTransitionTime":"2025-10-02T09:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.246200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.246268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.246279 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.246306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.246336 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:37Z","lastTransitionTime":"2025-10-02T09:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.349170 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.349232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.349245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.349268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.349281 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:37Z","lastTransitionTime":"2025-10-02T09:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.452803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.453023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.453047 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.453071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.453083 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:37Z","lastTransitionTime":"2025-10-02T09:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.555934 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.556002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.556014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.556034 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.556047 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:37Z","lastTransitionTime":"2025-10-02T09:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.659646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.659682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.659691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.659705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.659714 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:37Z","lastTransitionTime":"2025-10-02T09:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.761710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.762113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.762152 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.762172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.762185 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:37Z","lastTransitionTime":"2025-10-02T09:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.864809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.864858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.864874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.864892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.864905 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:37Z","lastTransitionTime":"2025-10-02T09:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.967995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.968033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.968042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.968056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:37 crc kubenswrapper[4722]: I1002 09:33:37.968066 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:37Z","lastTransitionTime":"2025-10-02T09:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.070337 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.070365 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.070373 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.070385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.070393 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:38Z","lastTransitionTime":"2025-10-02T09:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.147705 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/0.log" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.147746 4722 generic.go:334] "Generic (PLEG): container finished" podID="04c84986-75a1-4683-8764-8f7307d1126a" containerID="460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af" exitCode=1 Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.147778 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zwlns" event={"ID":"04c84986-75a1-4683-8764-8f7307d1126a","Type":"ContainerDied","Data":"460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.148174 4722 scope.go:117] "RemoveContainer" containerID="460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.172277 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z"
Has your network provider started?"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.188237 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.201308 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.221432 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z"
Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.234629 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.247481 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.266296 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e
7331302b9df8f5e4751727c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:15Z\\\",\\\"message\\\":\\\"uilt lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 09:33:15.499400 6373 services_controller.go:356] Processing sync for service openshift-apiserver-operator/metrics for network=default\\\\nI1002 09:33:15.498907 6373 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl in node crc\\\\nI1002 09:33:15.499439 6373 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl after 0 failed attempt(s)\\\\nI1002 09:33:15.499444 6373 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-jlvpl\\\\nF1002 09:33:15.498925 6373 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.280415 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.281192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.281246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.281254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.281268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.281277 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:38Z","lastTransitionTime":"2025-10-02T09:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.292548 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.303425 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.314393 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.323839 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dec5204-e2c5-43eb-8b80-80b74c9aa806\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beaa5e380cbd961e9e06c7aad0f3ff2f42c43eab5e89b6cf779003c490baf018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.334778 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T
09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.353662 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.368121 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.382692 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.383414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.383450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.383463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.383478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.383489 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:38Z","lastTransitionTime":"2025-10-02T09:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.394342 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.407575 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:37Z\\\",\\\"message\\\":\\\"2025-10-02T09:32:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f\\\\n2025-10-02T09:32:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f to /host/opt/cni/bin/\\\\n2025-10-02T09:32:52Z [verbose] multus-daemon started\\\\n2025-10-02T09:32:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T09:33:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:38Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.485900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.485927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.485934 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.485947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.485956 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:38Z","lastTransitionTime":"2025-10-02T09:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.588400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.588428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.588436 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.588451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.588460 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:38Z","lastTransitionTime":"2025-10-02T09:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.690341 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.690372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.690384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.690400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.690411 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:38Z","lastTransitionTime":"2025-10-02T09:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.711416 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:38 crc kubenswrapper[4722]: E1002 09:33:38.711520 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.711663 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:38 crc kubenswrapper[4722]: E1002 09:33:38.711708 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.711817 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:38 crc kubenswrapper[4722]: E1002 09:33:38.711867 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.711959 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:38 crc kubenswrapper[4722]: E1002 09:33:38.711999 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.792932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.792965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.792976 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.792994 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.793004 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:38Z","lastTransitionTime":"2025-10-02T09:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.895149 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.895182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.895192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.895207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.895218 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:38Z","lastTransitionTime":"2025-10-02T09:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.998697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.998737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.998749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.998765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:38 crc kubenswrapper[4722]: I1002 09:33:38.998778 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:38Z","lastTransitionTime":"2025-10-02T09:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.101545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.101593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.101602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.101615 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.101644 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:39Z","lastTransitionTime":"2025-10-02T09:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.159425 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/0.log" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.159489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zwlns" event={"ID":"04c84986-75a1-4683-8764-8f7307d1126a","Type":"ContainerStarted","Data":"aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a"} Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.175723 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.192606 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.205290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.205370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.205380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.205398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.205411 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:39Z","lastTransitionTime":"2025-10-02T09:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.207626 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dec5204-e2c5-43eb-8b80-80b74c9aa806\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beaa5e380cbd961e9e06c7aad0f3ff2f42c43eab5e89b6cf779003c490baf018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.229043 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.244123 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.259284 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.271170 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.284987 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.294467 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.307952 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:37Z\\\",\\\"message\\\":\\\"2025-10-02T09:32:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f\\\\n2025-10-02T09:32:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f to /host/opt/cni/bin/\\\\n2025-10-02T09:32:52Z [verbose] multus-daemon started\\\\n2025-10-02T09:32:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T09:33:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.308057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.308094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.308103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.308119 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.308131 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:39Z","lastTransitionTime":"2025-10-02T09:33:39Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.322572 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.335975 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.350911 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.365978 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.380068 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.393457 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.410658 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e
7331302b9df8f5e4751727c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:15Z\\\",\\\"message\\\":\\\"uilt lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 09:33:15.499400 6373 services_controller.go:356] Processing sync for service openshift-apiserver-operator/metrics for network=default\\\\nI1002 09:33:15.498907 6373 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl in node crc\\\\nI1002 09:33:15.499439 6373 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl after 0 failed attempt(s)\\\\nI1002 09:33:15.499444 6373 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-jlvpl\\\\nF1002 09:33:15.498925 6373 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.411612 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.411659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.411669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.411682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.411693 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:39Z","lastTransitionTime":"2025-10-02T09:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.423599 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:39Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.514193 4722 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.514228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.514236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.514251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.514261 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:39Z","lastTransitionTime":"2025-10-02T09:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.616938 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.616979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.616992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.617007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.617022 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:39Z","lastTransitionTime":"2025-10-02T09:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.718621 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.718656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.718666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.718681 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.718690 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:39Z","lastTransitionTime":"2025-10-02T09:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.821268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.821308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.821339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.821361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.821376 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:39Z","lastTransitionTime":"2025-10-02T09:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.923820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.923870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.923878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.923894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:39 crc kubenswrapper[4722]: I1002 09:33:39.923904 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:39Z","lastTransitionTime":"2025-10-02T09:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.025859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.025889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.025897 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.025911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.025919 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:40Z","lastTransitionTime":"2025-10-02T09:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.127840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.127883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.127894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.127911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.127921 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:40Z","lastTransitionTime":"2025-10-02T09:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.230365 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.230406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.230414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.230428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.230438 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:40Z","lastTransitionTime":"2025-10-02T09:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.332875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.332920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.332929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.332949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.332959 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:40Z","lastTransitionTime":"2025-10-02T09:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.435280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.435353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.435364 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.435381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.435398 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:40Z","lastTransitionTime":"2025-10-02T09:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.538598 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.538640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.538654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.538671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.538685 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:40Z","lastTransitionTime":"2025-10-02T09:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.641876 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.641916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.641926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.641944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.641954 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:40Z","lastTransitionTime":"2025-10-02T09:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.710221 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.710289 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.710354 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:40 crc kubenswrapper[4722]: E1002 09:33:40.710384 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.710386 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:40 crc kubenswrapper[4722]: E1002 09:33:40.710455 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:40 crc kubenswrapper[4722]: E1002 09:33:40.710546 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:40 crc kubenswrapper[4722]: E1002 09:33:40.710808 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.744513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.744557 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.744565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.744579 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.744589 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:40Z","lastTransitionTime":"2025-10-02T09:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.846768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.846819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.846832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.846851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.846918 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:40Z","lastTransitionTime":"2025-10-02T09:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.951997 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.952062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.952075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.952089 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:40 crc kubenswrapper[4722]: I1002 09:33:40.952098 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:40Z","lastTransitionTime":"2025-10-02T09:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.055507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.055559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.055570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.055588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.055600 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:41Z","lastTransitionTime":"2025-10-02T09:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.158169 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.158203 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.158213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.158228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.158238 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:41Z","lastTransitionTime":"2025-10-02T09:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.261204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.261273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.261284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.261305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.261339 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:41Z","lastTransitionTime":"2025-10-02T09:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.364485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.364542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.364555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.364582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.364605 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:41Z","lastTransitionTime":"2025-10-02T09:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.467328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.467378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.467387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.467411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.467423 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:41Z","lastTransitionTime":"2025-10-02T09:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.570541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.570581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.570590 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.570607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.570618 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:41Z","lastTransitionTime":"2025-10-02T09:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.673604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.673665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.673677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.673692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.673703 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:41Z","lastTransitionTime":"2025-10-02T09:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.709826 4722 scope.go:117] "RemoveContainer" containerID="d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.776620 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.776681 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.776693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.776716 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.776734 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:41Z","lastTransitionTime":"2025-10-02T09:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.880254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.880305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.880342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.880359 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.880369 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:41Z","lastTransitionTime":"2025-10-02T09:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.982266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.982295 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.982303 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.982362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:41 crc kubenswrapper[4722]: I1002 09:33:41.982379 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:41Z","lastTransitionTime":"2025-10-02T09:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.085865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.085916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.085926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.085942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.085962 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:42Z","lastTransitionTime":"2025-10-02T09:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.167609 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/2.log" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.169822 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90"} Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.170496 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.185624 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.189169 4722 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.189214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.189225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.189240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.189251 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:42Z","lastTransitionTime":"2025-10-02T09:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.201122 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:37Z\\\",\\\"message\\\":\\\"2025-10-02T09:32:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f\\\\n2025-10-02T09:32:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f to /host/opt/cni/bin/\\\\n2025-10-02T09:32:52Z [verbose] multus-daemon started\\\\n2025-10-02T09:32:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T09:33:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.219498 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.237198 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.251789 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.266470 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.281756 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.291089 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.291138 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.291148 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.291164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.291174 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:42Z","lastTransitionTime":"2025-10-02T09:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.298945 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.314461 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.342619 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523bd1dd360fb1d2b7a23339cf39f643b7dbd12a
237ec876e659b1a174797d90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:15Z\\\",\\\"message\\\":\\\"uilt lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 09:33:15.499400 6373 services_controller.go:356] Processing sync for service openshift-apiserver-operator/metrics for network=default\\\\nI1002 09:33:15.498907 6373 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl in node crc\\\\nI1002 09:33:15.499439 6373 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl after 0 failed attempt(s)\\\\nI1002 09:33:15.499444 6373 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-jlvpl\\\\nF1002 09:33:15.498925 6373 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.367447 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.385123 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dec5204-e2c5-43eb-8b80-80b74c9aa806\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beaa5e380cbd961e9e06c7aad0f3ff2f42c43eab5e89b6cf779003c490baf018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.393470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.393511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.393521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.393535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.393549 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:42Z","lastTransitionTime":"2025-10-02T09:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.403465 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.415262 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.428215 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.442063 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.455767 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.471209 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:42Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.496701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.496744 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.496768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.496785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.496796 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:42Z","lastTransitionTime":"2025-10-02T09:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.600397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.600443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.600456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.600473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.600483 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:42Z","lastTransitionTime":"2025-10-02T09:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.703650 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.703705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.703718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.703738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.703751 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:42Z","lastTransitionTime":"2025-10-02T09:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.710013 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.710130 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.710027 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:42 crc kubenswrapper[4722]: E1002 09:33:42.710183 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.710298 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:42 crc kubenswrapper[4722]: E1002 09:33:42.710442 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:42 crc kubenswrapper[4722]: E1002 09:33:42.710537 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:42 crc kubenswrapper[4722]: E1002 09:33:42.710848 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.806205 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.806244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.806253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.806268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.806278 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:42Z","lastTransitionTime":"2025-10-02T09:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.908404 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.908448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.908459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.908475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:42 crc kubenswrapper[4722]: I1002 09:33:42.908484 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:42Z","lastTransitionTime":"2025-10-02T09:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.011050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.011123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.011133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.011148 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.011159 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.114974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.115006 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.115015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.115030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.115041 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.175409 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/3.log" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.176008 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/2.log" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.179238 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" exitCode=1 Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.179285 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.179348 4722 scope.go:117] "RemoveContainer" containerID="d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.179999 4722 scope.go:117] "RemoveContainer" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:33:43 crc kubenswrapper[4722]: E1002 09:33:43.180185 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.194309 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.207733 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.217852 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.217907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.217922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.217942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.217959 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.228801 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.242989 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dec5204-e2c5-43eb-8b80-80b74c9aa806\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beaa5e380cbd961e9e06c7aad0f3ff2f42c43eab5e89b6cf779003c490baf018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.258281 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.276072 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.291784 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.305930 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.317854 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.319865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.319900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.319911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.319929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.319941 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.332841 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:37Z\\\",\\\"message\\\":\\\"2025-10-02T09:32:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f\\\\n2025-10-02T09:32:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f to /host/opt/cni/bin/\\\\n2025-10-02T09:32:52Z [verbose] multus-daemon started\\\\n2025-10-02T09:32:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T09:33:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.352123 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.370234 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.382836 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.394772 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.408787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.421721 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.422461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.422515 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.422525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.422544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.422554 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.443299 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d680497dd4585ef08ee0f4b4ee45281ecdda251e7331302b9df8f5e4751727c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:15Z\\\",\\\"message\\\":\\\"uilt lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 09:33:15.499400 6373 services_controller.go:356] Processing sync for service openshift-apiserver-operator/metrics for network=default\\\\nI1002 09:33:15.498907 6373 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl in node crc\\\\nI1002 09:33:15.499439 6373 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-jlvpl after 0 failed attempt(s)\\\\nI1002 09:33:15.499444 6373 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-jlvpl\\\\nF1002 09:33:15.498925 6373 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:42Z\\\",\\\"message\\\":\\\"troller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.672774 6729 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 09:33:42.672857 6729 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.672929 6729 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.673214 6729 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 09:33:42.673850 6729 factory.go:656] Stopping watch factory\\\\nI1002 09:33:42.714471 6729 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1002 09:33:42.714503 6729 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1002 09:33:42.714596 6729 ovnkube.go:599] Stopped ovnkube\\\\nI1002 09:33:42.714617 6729 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 09:33:42.714696 6729 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.464229 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.526021 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.526079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.526094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.526118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.526134 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.629769 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.629853 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.629894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.629933 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.629960 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.732521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.732575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.732588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.732613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.732625 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.778722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.778772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.778781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.778799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.778811 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: E1002 09:33:43.794586 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.798837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.798908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.798924 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.798958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.798979 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: E1002 09:33:43.815652 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.826485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.826577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.826600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.826648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.826666 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: E1002 09:33:43.841246 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.846346 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.846397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.846409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.846429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.846443 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: E1002 09:33:43.863069 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.866983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.867022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.867034 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.867094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.867110 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: E1002 09:33:43.879421 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:43Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:43 crc kubenswrapper[4722]: E1002 09:33:43.879552 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.881727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.881757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.881766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.881785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.881800 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.984603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.984651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.984661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.984680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:43 crc kubenswrapper[4722]: I1002 09:33:43.984690 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:43Z","lastTransitionTime":"2025-10-02T09:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.087662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.087706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.087718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.087734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.087744 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:44Z","lastTransitionTime":"2025-10-02T09:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.183536 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/3.log" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.186634 4722 scope.go:117] "RemoveContainer" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:33:44 crc kubenswrapper[4722]: E1002 09:33:44.186764 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.190096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.190120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.190128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.190140 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.190149 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:44Z","lastTransitionTime":"2025-10-02T09:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.202952 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.217700 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.234093 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:37Z\\\",\\\"message\\\":\\\"2025-10-02T09:32:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f\\\\n2025-10-02T09:32:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f to /host/opt/cni/bin/\\\\n2025-10-02T09:32:52Z [verbose] multus-daemon started\\\\n2025-10-02T09:32:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T09:33:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.257726 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.273924 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.289588 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.292604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.292686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.292712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.292782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.292807 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:44Z","lastTransitionTime":"2025-10-02T09:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.305598 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.321095 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.336993 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.358138 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523bd1dd360fb1d2b7a23339cf39f643b7dbd12a
237ec876e659b1a174797d90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:42Z\\\",\\\"message\\\":\\\"troller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.672774 6729 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 09:33:42.672857 6729 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.672929 6729 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.673214 6729 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 09:33:42.673850 6729 factory.go:656] Stopping watch factory\\\\nI1002 09:33:42.714471 6729 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1002 09:33:42.714503 6729 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1002 09:33:42.714596 6729 ovnkube.go:599] Stopped ovnkube\\\\nI1002 09:33:42.714617 6729 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 09:33:42.714696 6729 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.374462 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.387924 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a93800
66b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.395991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.396022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.396033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.396049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.396060 4722 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:44Z","lastTransitionTime":"2025-10-02T09:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.400292 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dec5204-e2c5-43eb-8b80-80b74c9aa806\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beaa5e380cbd961e9e06c7aad0f3ff2f42c43eab5e89b6cf779003c490baf018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.414724 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 
2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.430357 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.445185 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.459338 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.473605 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.500834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.500932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.500962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.500999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.501025 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:44Z","lastTransitionTime":"2025-10-02T09:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.603969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.604042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.604183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.604229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.604240 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:44Z","lastTransitionTime":"2025-10-02T09:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.708123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.708190 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.708203 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.708230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.708248 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:44Z","lastTransitionTime":"2025-10-02T09:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.709392 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.709441 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.709575 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.709740 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:44 crc kubenswrapper[4722]: E1002 09:33:44.709747 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:44 crc kubenswrapper[4722]: E1002 09:33:44.709956 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:44 crc kubenswrapper[4722]: E1002 09:33:44.710029 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:44 crc kubenswrapper[4722]: E1002 09:33:44.710205 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.722969 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dec5204-e2c5-43eb-8b80-80b74c9aa806\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beaa5e380cbd961e9e06c7aad0f3ff2f42c43eab5e89b6cf779003c490baf018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.738234 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.751964 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.768908 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.782869 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.798492 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.810792 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.810832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.810843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.810861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.810873 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:44Z","lastTransitionTime":"2025-10-02T09:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.811961 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.827362 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.849183 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:37Z\\\",\\\"message\\\":\\\"2025-10-02T09:32:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f\\\\n2025-10-02T09:32:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f to /host/opt/cni/bin/\\\\n2025-10-02T09:32:52Z [verbose] multus-daemon started\\\\n2025-10-02T09:32:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T09:33:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.865420 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.884056 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.898370 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.912178 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.914613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.914670 4722 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.914683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.914726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.914753 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:44Z","lastTransitionTime":"2025-10-02T09:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.929448 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.945753 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.958045 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:44 crc kubenswrapper[4722]: I1002 09:33:44.987236 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523bd1dd360fb1d2b7a23339cf39f643b7dbd12a
237ec876e659b1a174797d90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:42Z\\\",\\\"message\\\":\\\"troller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.672774 6729 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 09:33:42.672857 6729 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.672929 6729 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.673214 6729 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 09:33:42.673850 6729 factory.go:656] Stopping watch factory\\\\nI1002 09:33:42.714471 6729 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1002 09:33:42.714503 6729 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1002 09:33:42.714596 6729 ovnkube.go:599] Stopped ovnkube\\\\nI1002 09:33:42.714617 6729 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 09:33:42.714696 6729 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:44Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.007513 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:45Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.018407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.018475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.018499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.018528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.018548 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:45Z","lastTransitionTime":"2025-10-02T09:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.121487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.121543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.121556 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.121582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.121593 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:45Z","lastTransitionTime":"2025-10-02T09:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.224654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.224711 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.224750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.224771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.224785 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:45Z","lastTransitionTime":"2025-10-02T09:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.326903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.326984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.327001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.327036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.327053 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:45Z","lastTransitionTime":"2025-10-02T09:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.429829 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.429889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.429898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.429914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.429923 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:45Z","lastTransitionTime":"2025-10-02T09:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.532770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.532834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.532845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.532860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.532872 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:45Z","lastTransitionTime":"2025-10-02T09:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.636346 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.636424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.636443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.636479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.636500 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:45Z","lastTransitionTime":"2025-10-02T09:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.739018 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.739052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.739060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.739075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.739082 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:45Z","lastTransitionTime":"2025-10-02T09:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.841600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.841657 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.841675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.841691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.841725 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:45Z","lastTransitionTime":"2025-10-02T09:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.967197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.967773 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.967865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.967962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:45 crc kubenswrapper[4722]: I1002 09:33:45.968049 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:45Z","lastTransitionTime":"2025-10-02T09:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.070415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.070453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.070464 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.070480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.070493 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:46Z","lastTransitionTime":"2025-10-02T09:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.172872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.173114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.173191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.173266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.173352 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:46Z","lastTransitionTime":"2025-10-02T09:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.275577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.275608 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.275617 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.275631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.275639 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:46Z","lastTransitionTime":"2025-10-02T09:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.377358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.377400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.377412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.377428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.377438 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:46Z","lastTransitionTime":"2025-10-02T09:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.479981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.480011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.480020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.480032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.480041 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:46Z","lastTransitionTime":"2025-10-02T09:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.582211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.582240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.582249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.582261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.582269 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:46Z","lastTransitionTime":"2025-10-02T09:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.684548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.684589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.684598 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.684617 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.684628 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:46Z","lastTransitionTime":"2025-10-02T09:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.709516 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.709568 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.709538 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.709524 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:46 crc kubenswrapper[4722]: E1002 09:33:46.709666 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:46 crc kubenswrapper[4722]: E1002 09:33:46.709763 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:46 crc kubenswrapper[4722]: E1002 09:33:46.709902 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:46 crc kubenswrapper[4722]: E1002 09:33:46.710045 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.787131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.787173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.787181 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.787196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.787205 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:46Z","lastTransitionTime":"2025-10-02T09:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.889303 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.889369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.889385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.889425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.889435 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:46Z","lastTransitionTime":"2025-10-02T09:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.991560 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.991616 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.991625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.991641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:46 crc kubenswrapper[4722]: I1002 09:33:46.991671 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:46Z","lastTransitionTime":"2025-10-02T09:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.094382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.094721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.094820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.094924 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.095008 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:47Z","lastTransitionTime":"2025-10-02T09:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.197807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.197858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.197867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.197882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.197891 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:47Z","lastTransitionTime":"2025-10-02T09:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.300888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.300922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.300931 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.300944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.300953 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:47Z","lastTransitionTime":"2025-10-02T09:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.403631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.403675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.403684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.403699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.403709 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:47Z","lastTransitionTime":"2025-10-02T09:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.505898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.505932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.505946 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.505969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.505982 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:47Z","lastTransitionTime":"2025-10-02T09:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.608474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.608508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.608518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.608532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.608542 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:47Z","lastTransitionTime":"2025-10-02T09:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.711760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.711827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.711838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.711874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.711887 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:47Z","lastTransitionTime":"2025-10-02T09:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.814594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.814634 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.814643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.814662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.814672 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:47Z","lastTransitionTime":"2025-10-02T09:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.917775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.918378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.918458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.918529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:47 crc kubenswrapper[4722]: I1002 09:33:47.918605 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:47Z","lastTransitionTime":"2025-10-02T09:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.021911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.021950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.021960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.021976 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.022029 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:48Z","lastTransitionTime":"2025-10-02T09:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.123785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.124169 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.124246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.124338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.124460 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:48Z","lastTransitionTime":"2025-10-02T09:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.226765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.226799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.226807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.226820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.226830 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:48Z","lastTransitionTime":"2025-10-02T09:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.329063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.329815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.329838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.329856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.329866 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:48Z","lastTransitionTime":"2025-10-02T09:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.432699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.432776 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.432787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.432805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.432815 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:48Z","lastTransitionTime":"2025-10-02T09:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.535286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.535345 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.535356 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.535370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.535380 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:48Z","lastTransitionTime":"2025-10-02T09:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.638259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.638358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.638372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.638393 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.638406 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:48Z","lastTransitionTime":"2025-10-02T09:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.709410 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.709419 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.709593 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:48 crc kubenswrapper[4722]: E1002 09:33:48.709736 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.709753 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:48 crc kubenswrapper[4722]: E1002 09:33:48.709916 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:48 crc kubenswrapper[4722]: E1002 09:33:48.709984 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:48 crc kubenswrapper[4722]: E1002 09:33:48.710114 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.741435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.741488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.741504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.741524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.741540 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:48Z","lastTransitionTime":"2025-10-02T09:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.848856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.848939 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.848954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.848975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.848989 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:48Z","lastTransitionTime":"2025-10-02T09:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.951482 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.951543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.951556 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.951577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:48 crc kubenswrapper[4722]: I1002 09:33:48.951591 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:48Z","lastTransitionTime":"2025-10-02T09:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.055247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.055295 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.055306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.055354 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.055367 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:49Z","lastTransitionTime":"2025-10-02T09:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.159686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.159793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.159809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.159832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.159846 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:49Z","lastTransitionTime":"2025-10-02T09:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.263737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.263799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.263810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.263830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.263842 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:49Z","lastTransitionTime":"2025-10-02T09:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.366651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.366700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.366714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.366731 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.366744 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:49Z","lastTransitionTime":"2025-10-02T09:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.469458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.469501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.469515 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.469537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.469555 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:49Z","lastTransitionTime":"2025-10-02T09:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.571222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.571257 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.571271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.571296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.571310 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:49Z","lastTransitionTime":"2025-10-02T09:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.673973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.674026 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.674039 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.674062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.674079 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:49Z","lastTransitionTime":"2025-10-02T09:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.726100 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.776618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.776661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.776670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.776684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.776696 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:49Z","lastTransitionTime":"2025-10-02T09:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.880298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.880383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.880397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.880423 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.880441 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:49Z","lastTransitionTime":"2025-10-02T09:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.984620 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.984718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.984747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.984788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:49 crc kubenswrapper[4722]: I1002 09:33:49.984811 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:49Z","lastTransitionTime":"2025-10-02T09:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.088001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.088048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.088058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.088078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.088091 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:50Z","lastTransitionTime":"2025-10-02T09:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.190917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.190978 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.190997 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.191023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.191041 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:50Z","lastTransitionTime":"2025-10-02T09:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.329626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.329677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.329687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.329704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.329717 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:50Z","lastTransitionTime":"2025-10-02T09:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.432758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.432814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.432826 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.432842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.432855 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:50Z","lastTransitionTime":"2025-10-02T09:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.479138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.479416 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.479397741 +0000 UTC m=+150.516150721 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.534927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.534971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.534985 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.535000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.535010 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:50Z","lastTransitionTime":"2025-10-02T09:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.580761 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.580882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.580923 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.580958 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.580993 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581092 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.581062683 +0000 UTC m=+150.617815663 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581164 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581204 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581230 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581249 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581270 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.581243428 +0000 UTC m=+150.617996458 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581348 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581437 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581463 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581364 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.58130069 +0000 UTC m=+150.618053710 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.581597 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.581569717 +0000 UTC m=+150.618322867 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.637235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.637295 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.637340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.637363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.637390 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:50Z","lastTransitionTime":"2025-10-02T09:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.709612 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.709731 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.709731 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.709835 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.710010 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.710057 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.710142 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:50 crc kubenswrapper[4722]: E1002 09:33:50.710191 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.740055 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.740118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.740129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.740148 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.740160 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:50Z","lastTransitionTime":"2025-10-02T09:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.843435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.843466 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.843473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.843487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.843495 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:50Z","lastTransitionTime":"2025-10-02T09:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.946911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.946970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.946984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.947007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:50 crc kubenswrapper[4722]: I1002 09:33:50.947023 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:50Z","lastTransitionTime":"2025-10-02T09:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.049922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.049963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.049990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.050008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.050019 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:51Z","lastTransitionTime":"2025-10-02T09:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.152509 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.152545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.152552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.152565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.152574 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:51Z","lastTransitionTime":"2025-10-02T09:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.255522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.255881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.255895 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.255914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.255926 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:51Z","lastTransitionTime":"2025-10-02T09:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.358096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.358143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.358153 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.358170 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.358181 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:51Z","lastTransitionTime":"2025-10-02T09:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.460476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.460522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.460531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.460546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.460557 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:51Z","lastTransitionTime":"2025-10-02T09:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.563467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.563509 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.563544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.563559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.563569 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:51Z","lastTransitionTime":"2025-10-02T09:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.665486 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.665527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.665565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.665583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.665593 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:51Z","lastTransitionTime":"2025-10-02T09:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.768397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.768426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.768471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.768487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.768496 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:51Z","lastTransitionTime":"2025-10-02T09:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.871114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.871154 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.871171 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.871186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.871196 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:51Z","lastTransitionTime":"2025-10-02T09:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.973426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.973459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.973467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.973482 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:51 crc kubenswrapper[4722]: I1002 09:33:51.973491 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:51Z","lastTransitionTime":"2025-10-02T09:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.075811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.075840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.075849 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.075863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.075871 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:52Z","lastTransitionTime":"2025-10-02T09:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.178677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.178734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.178744 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.178760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.178769 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:52Z","lastTransitionTime":"2025-10-02T09:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.280574 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.280627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.280652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.280668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.280679 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:52Z","lastTransitionTime":"2025-10-02T09:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.382669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.382698 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.382707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.382735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.382746 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:52Z","lastTransitionTime":"2025-10-02T09:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.484875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.484927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.484937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.484952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.484962 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:52Z","lastTransitionTime":"2025-10-02T09:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.587266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.587305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.587449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.587467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.587476 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:52Z","lastTransitionTime":"2025-10-02T09:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.690702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.690752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.690762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.690779 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.690791 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:52Z","lastTransitionTime":"2025-10-02T09:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.709417 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.709522 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.709535 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.709471 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:52 crc kubenswrapper[4722]: E1002 09:33:52.709625 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:52 crc kubenswrapper[4722]: E1002 09:33:52.709688 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:52 crc kubenswrapper[4722]: E1002 09:33:52.709817 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:52 crc kubenswrapper[4722]: E1002 09:33:52.709925 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.792975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.793009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.793020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.793033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.793041 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:52Z","lastTransitionTime":"2025-10-02T09:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.895060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.895096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.895107 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.895124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.895134 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:52Z","lastTransitionTime":"2025-10-02T09:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.997135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.997164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.997172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.997184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:52 crc kubenswrapper[4722]: I1002 09:33:52.997194 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:52Z","lastTransitionTime":"2025-10-02T09:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.099474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.099508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.099522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.099537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.099546 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.201617 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.201666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.201682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.201705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.201723 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.304078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.304126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.304137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.304154 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.304166 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.405695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.405745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.405755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.405770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.405779 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.509097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.509143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.509156 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.509174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.509190 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.611440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.611488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.611499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.611512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.611537 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.714212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.714248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.714256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.714270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.714279 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.816686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.816723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.816748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.816764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.816776 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.919600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.919651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.919660 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.919675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.919684 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.922605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.922645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.922656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.922671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.922679 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: E1002 09:33:53.936844 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.941898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.941944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.941958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.941982 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.941998 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: E1002 09:33:53.955251 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.959649 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.959703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.959717 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.959742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.959760 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: E1002 09:33:53.975024 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.979541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.979588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.979604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.979630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.979648 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:53 crc kubenswrapper[4722]: E1002 09:33:53.994389 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:53Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.999026 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.999081 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.999101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.999131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:53 crc kubenswrapper[4722]: I1002 09:33:53.999145 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:53Z","lastTransitionTime":"2025-10-02T09:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: E1002 09:33:54.015372 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: E1002 09:33:54.015548 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.022138 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.022173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.022185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.022204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.022220 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:54Z","lastTransitionTime":"2025-10-02T09:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.124853 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.124897 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.124907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.124946 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.124960 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:54Z","lastTransitionTime":"2025-10-02T09:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.227729 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.227781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.227791 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.227812 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.227822 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:54Z","lastTransitionTime":"2025-10-02T09:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.330731 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.330779 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.330792 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.330809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.330822 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:54Z","lastTransitionTime":"2025-10-02T09:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.433531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.433568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.433577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.433594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.433604 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:54Z","lastTransitionTime":"2025-10-02T09:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.536840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.536895 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.536904 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.536925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.536935 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:54Z","lastTransitionTime":"2025-10-02T09:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.639985 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.640046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.640057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.640075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.640103 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:54Z","lastTransitionTime":"2025-10-02T09:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.710263 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.710371 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:54 crc kubenswrapper[4722]: E1002 09:33:54.710492 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.710529 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.710554 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:54 crc kubenswrapper[4722]: E1002 09:33:54.710729 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:54 crc kubenswrapper[4722]: E1002 09:33:54.711467 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:54 crc kubenswrapper[4722]: E1002 09:33:54.711572 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.721446 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgg5w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jn9jw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.735733 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8589e9ec-e8ad-4b32-8846-841b689971b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf821bcfda8c6b7f9823ecb4a9dc7844eae7b1005d41e7bfe9b47df5d66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664f4e51f677aedd16cbcaed322e25d228c87936e30996f36dd3a5b97a43def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14f04556a2e11720ac3e92ec6f1c436468001e87b2984d007f9510a0e6ff0c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546efffc2572c644b7c949abadbaa2f8c9cc0449e08bb5615d3de0504f1bb9e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.742287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.742331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.742380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.742410 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.742451 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:54Z","lastTransitionTime":"2025-10-02T09:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.750245 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://562dcc33d7454113e3dec145a8b2faba06d442892ac65301a5713c82e3ca73e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.775112 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc362a8f-ced8-4af3-b908-f856e59845fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece7756234b7c8bc4d84fe1c8e1ed2c722cdcca266f4cd32d29cd5ff0df57b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c0fa498ca64b09ac096d8d7c72a5e2e6e95e2297e646ef2c6fd5bf0a0e60fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sjg5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 
09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.795325 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04526f44dc9d5fde75c6cc4faf35d3
c6aea06d38f7bb3ec9ca879938aa52e92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:42Z\\\",\\\"message\\\":\\\"troller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.672774 6729 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 09:33:42.672857 6729 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.672929 6729 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 09:33:42.673214 6729 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 09:33:42.673850 6729 factory.go:656] Stopping watch factory\\\\nI1002 09:33:42.714471 6729 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1002 09:33:42.714503 6729 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1002 09:33:42.714596 6729 ovnkube.go:599] Stopped ovnkube\\\\nI1002 09:33:42.714617 6729 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 09:33:42.714696 6729 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:33:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pfvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jlvpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.810093 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3945e65-623a-40eb-82ae-8b17bcf8827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497de5e5e05ba246ef4a053b62e09441e5e43342b0a61f446d7dabe2066b40b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7232dfd8fe1b07842effb04b2ae370344d51627cc727be7af312e45fa83a2b4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece929e9ad27092137bf82dc101ea1ed39578522354cd75c0851fb376ce65b97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da0ce8993fd8ba3d69ac20a1eb0f376802613c5b77c01669d2c56c8c92fda93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-02T09:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aae4345e83b425a0fb36162fa05b9c6d6eeb9dc955241685306354be28d5b99c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da056a17d9ad5b6d87764aa08734b1d55ce0259bb082147a34262665fc9700a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1106e7493902f4827ea293fd6a7291b3049a60ee44375b79a7afa11d5c862235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49vfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cc2g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.830795 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc3e6ea1-f33f-4d5b-8b20-421ffe42689b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87be6d5b772e600788aafb9be35368a8ca62bbb1d2553d4ac9bf8ab684c107c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30003d3713199085ff1a79d8ea9e66d2d6f787689e072f639d611c6ccf41f8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df2d4d186bea7154dd49391b5a32f6dae60214d08a4f0efc3ba770b950d602f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5e7b6ece8ad6107bea24b258aa605bdedb34601326885a23c7d26e9c85e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9de4346b8278b90978ca17be9cb2d7141d076585875086061cd940b70b9842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16af577303472240f6be1eb0175cae33d92df4e7965dcc21d449e5426cde3cc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16af577303472240f6be1eb0175cae33d92df4e7965dcc21d449e5426cde3cc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e52fd01cec3feb2e23116be0ee802fff889e44f5b835b09ce15cce2d607d2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e52fd01cec3feb2e23116be0ee802fff889e44f5b835b09ce15cce2d607d2c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8cb6a1648af34bd7f97bbe60686c2d57c6b8e9ea5b15aba7ca25460565febbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cb6a1648af34bd7f97bbe60686c2d57c6b8e9ea5b15aba7ca25460565febbd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.846215 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.846256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.846373 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.846386 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.846408 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.846422 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:54Z","lastTransitionTime":"2025-10-02T09:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.864686 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.878586 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.892417 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.903938 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.915132 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.928165 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.940636 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dec5204-e2c5-43eb-8b80-80b74c9aa806\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beaa5e380cbd961e9e06c7aad0f3ff2f42c43eab5e89b6cf779003c490baf018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.948288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.948467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.948495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.948511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.948522 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:54Z","lastTransitionTime":"2025-10-02T09:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.954076 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.966844 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc4782
74c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.976364 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hm66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f7c5a4b-6be1-42da-abd5-039f84d7d6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9384fcdadf3735b0de65edc5af7e3d88b55d03043291aea0bc8cc8de945e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7k768\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hm66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:54 crc kubenswrapper[4722]: I1002 09:33:54.988412 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zwlns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04c84986-75a1-4683-8764-8f7307d1126a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T09:33:37Z\\\",\\\"message\\\":\\\"2025-10-02T09:32:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f\\\\n2025-10-02T09:32:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d22107e-3534-4c9e-95a9-9516dceffe8f to /host/opt/cni/bin/\\\\n2025-10-02T09:32:52Z [verbose] multus-daemon started\\\\n2025-10-02T09:32:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T09:33:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghrmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zwlns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:33:54Z is after 2025-08-24T17:21:41Z" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.054238 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.054282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.054292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.054307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.054341 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:55Z","lastTransitionTime":"2025-10-02T09:33:55Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.157027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.157063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.157074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.157089 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.157101 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:55Z","lastTransitionTime":"2025-10-02T09:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.259613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.259650 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.259661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.259675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.259683 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:55Z","lastTransitionTime":"2025-10-02T09:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.362104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.362144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.362156 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.362173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.362185 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:55Z","lastTransitionTime":"2025-10-02T09:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.464452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.464511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.464526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.464542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.464668 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:55Z","lastTransitionTime":"2025-10-02T09:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.566759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.566797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.566807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.566821 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.566831 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:55Z","lastTransitionTime":"2025-10-02T09:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.668951 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.668988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.668996 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.669013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.669021 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:55Z","lastTransitionTime":"2025-10-02T09:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.771150 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.771192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.771204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.771219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.771230 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:55Z","lastTransitionTime":"2025-10-02T09:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.873891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.873959 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.873981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.874006 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.874024 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:55Z","lastTransitionTime":"2025-10-02T09:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.976988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.977064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.977074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.977088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:55 crc kubenswrapper[4722]: I1002 09:33:55.977099 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:55Z","lastTransitionTime":"2025-10-02T09:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.079768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.079817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.079827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.079844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.079854 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:56Z","lastTransitionTime":"2025-10-02T09:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.182167 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.182227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.182242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.182255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.182265 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:56Z","lastTransitionTime":"2025-10-02T09:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.285034 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.285089 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.285106 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.285127 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.285142 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:56Z","lastTransitionTime":"2025-10-02T09:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.387137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.387169 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.387177 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.387191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.387201 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:56Z","lastTransitionTime":"2025-10-02T09:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.489948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.489993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.490002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.490019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.490029 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:56Z","lastTransitionTime":"2025-10-02T09:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.593103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.593147 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.593156 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.593171 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.593180 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:56Z","lastTransitionTime":"2025-10-02T09:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.696641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.696687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.696697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.696716 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.696739 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:56Z","lastTransitionTime":"2025-10-02T09:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.709619 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.709746 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.709760 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:56 crc kubenswrapper[4722]: E1002 09:33:56.709860 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:56 crc kubenswrapper[4722]: E1002 09:33:56.709953 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:56 crc kubenswrapper[4722]: E1002 09:33:56.710067 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.710131 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:56 crc kubenswrapper[4722]: E1002 09:33:56.710210 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.799241 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.799281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.799290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.799304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.799334 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:56Z","lastTransitionTime":"2025-10-02T09:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.901857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.901901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.901911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.901927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:56 crc kubenswrapper[4722]: I1002 09:33:56.901937 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:56Z","lastTransitionTime":"2025-10-02T09:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.003686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.003726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.003736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.003752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.003762 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:57Z","lastTransitionTime":"2025-10-02T09:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.107371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.107405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.107414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.107429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.107440 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:57Z","lastTransitionTime":"2025-10-02T09:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.210474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.210513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.210523 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.210537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.210547 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:57Z","lastTransitionTime":"2025-10-02T09:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.313148 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.313185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.313194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.313224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.313236 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:57Z","lastTransitionTime":"2025-10-02T09:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.415425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.415470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.415478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.415491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.415500 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:57Z","lastTransitionTime":"2025-10-02T09:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.518829 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.518892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.518907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.518932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.518957 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:57Z","lastTransitionTime":"2025-10-02T09:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.621462 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.621502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.621511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.621532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.621543 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:57Z","lastTransitionTime":"2025-10-02T09:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.710809 4722 scope.go:117] "RemoveContainer" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:33:57 crc kubenswrapper[4722]: E1002 09:33:57.711064 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.724738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.724777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.724791 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.724810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.724825 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:57Z","lastTransitionTime":"2025-10-02T09:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.826632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.826667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.826676 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.826690 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.826698 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:57Z","lastTransitionTime":"2025-10-02T09:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.928760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.928808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.928819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.928836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:57 crc kubenswrapper[4722]: I1002 09:33:57.928848 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:57Z","lastTransitionTime":"2025-10-02T09:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.031191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.031229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.031239 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.031270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.031280 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:58Z","lastTransitionTime":"2025-10-02T09:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.134092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.134128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.134138 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.134152 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.134163 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:58Z","lastTransitionTime":"2025-10-02T09:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.236022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.236086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.236095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.236113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.236125 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:58Z","lastTransitionTime":"2025-10-02T09:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.339804 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.339850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.339861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.339883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.339896 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:58Z","lastTransitionTime":"2025-10-02T09:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.442474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.442544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.442563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.442591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.442606 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:58Z","lastTransitionTime":"2025-10-02T09:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.545183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.545219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.545229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.545246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.545258 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:58Z","lastTransitionTime":"2025-10-02T09:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.648073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.648124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.648133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.648150 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.648162 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:58Z","lastTransitionTime":"2025-10-02T09:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.709422 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.709503 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.709416 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.709416 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:33:58 crc kubenswrapper[4722]: E1002 09:33:58.709636 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:33:58 crc kubenswrapper[4722]: E1002 09:33:58.709754 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:33:58 crc kubenswrapper[4722]: E1002 09:33:58.709905 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:33:58 crc kubenswrapper[4722]: E1002 09:33:58.710009 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.751963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.752017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.752032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.752230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.752242 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:58Z","lastTransitionTime":"2025-10-02T09:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.855309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.855390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.855406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.855422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.855434 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:58Z","lastTransitionTime":"2025-10-02T09:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.957631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.957669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.957678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.957694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:58 crc kubenswrapper[4722]: I1002 09:33:58.957737 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:58Z","lastTransitionTime":"2025-10-02T09:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.061950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.062022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.062051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.062088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.062113 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:59Z","lastTransitionTime":"2025-10-02T09:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.165850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.165909 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.165930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.165960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.165980 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:59Z","lastTransitionTime":"2025-10-02T09:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.269575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.269626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.269641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.269658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.269668 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:59Z","lastTransitionTime":"2025-10-02T09:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.373128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.373221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.373251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.373294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.373348 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:59Z","lastTransitionTime":"2025-10-02T09:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.476628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.476672 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.476686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.476704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.476717 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:59Z","lastTransitionTime":"2025-10-02T09:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.579599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.579670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.579689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.579722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.579743 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:59Z","lastTransitionTime":"2025-10-02T09:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.682661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.682713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.682728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.682751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.682765 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:59Z","lastTransitionTime":"2025-10-02T09:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.785729 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.785764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.785775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.785793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.785805 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:59Z","lastTransitionTime":"2025-10-02T09:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.888025 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.888078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.888099 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.888118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.888129 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:59Z","lastTransitionTime":"2025-10-02T09:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.990974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.991086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.991097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.991120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:33:59 crc kubenswrapper[4722]: I1002 09:33:59.991134 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:33:59Z","lastTransitionTime":"2025-10-02T09:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.093586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.093628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.093637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.093652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.093661 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:00Z","lastTransitionTime":"2025-10-02T09:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.196765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.196807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.196816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.196831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.196840 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:00Z","lastTransitionTime":"2025-10-02T09:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.298614 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.299005 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.299015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.299031 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.299040 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:00Z","lastTransitionTime":"2025-10-02T09:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.401119 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.401172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.401187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.401209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.401222 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:00Z","lastTransitionTime":"2025-10-02T09:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.504136 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.504182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.504190 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.504206 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.504215 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:00Z","lastTransitionTime":"2025-10-02T09:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.607448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.607501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.607513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.607533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.607548 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:00Z","lastTransitionTime":"2025-10-02T09:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.710206 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.710241 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.710250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.710266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.710278 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:00Z","lastTransitionTime":"2025-10-02T09:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.710657 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.710663 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.710677 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:00 crc kubenswrapper[4722]: E1002 09:34:00.710731 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.710773 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:00 crc kubenswrapper[4722]: E1002 09:34:00.710819 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:00 crc kubenswrapper[4722]: E1002 09:34:00.710855 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:00 crc kubenswrapper[4722]: E1002 09:34:00.710930 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.813452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.813505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.813514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.813529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.813538 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:00Z","lastTransitionTime":"2025-10-02T09:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.916944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.917034 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.917058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.917099 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:00 crc kubenswrapper[4722]: I1002 09:34:00.917124 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:00Z","lastTransitionTime":"2025-10-02T09:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.019200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.019240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.019254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.019271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.019283 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:01Z","lastTransitionTime":"2025-10-02T09:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.122524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.122575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.122589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.122613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.122629 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:01Z","lastTransitionTime":"2025-10-02T09:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.225641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.225702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.225712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.225726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.225735 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:01Z","lastTransitionTime":"2025-10-02T09:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.329221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.329274 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.329287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.329340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.329360 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:01Z","lastTransitionTime":"2025-10-02T09:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.432031 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.432103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.432121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.432155 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.432178 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:01Z","lastTransitionTime":"2025-10-02T09:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.535235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.535304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.535357 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.535390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.535410 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:01Z","lastTransitionTime":"2025-10-02T09:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.637286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.637392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.637406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.637422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.637455 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:01Z","lastTransitionTime":"2025-10-02T09:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.740459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.740508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.740522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.740540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.740552 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:01Z","lastTransitionTime":"2025-10-02T09:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.842579 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.842639 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.842651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.842666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.842676 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:01Z","lastTransitionTime":"2025-10-02T09:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.944405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.944445 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.944456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.944471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:01 crc kubenswrapper[4722]: I1002 09:34:01.944481 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:01Z","lastTransitionTime":"2025-10-02T09:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.046828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.046863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.046875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.046891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.046901 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:02Z","lastTransitionTime":"2025-10-02T09:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.149803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.149855 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.149866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.149883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.149898 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:02Z","lastTransitionTime":"2025-10-02T09:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.252298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.252367 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.252376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.252391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.252400 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:02Z","lastTransitionTime":"2025-10-02T09:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.355048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.355088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.355096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.355114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.355124 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:02Z","lastTransitionTime":"2025-10-02T09:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.457399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.457464 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.457473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.457488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.457498 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:02Z","lastTransitionTime":"2025-10-02T09:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.560055 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.560102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.560110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.560124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.560134 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:02Z","lastTransitionTime":"2025-10-02T09:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.662236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.662271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.662280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.662292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.662301 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:02Z","lastTransitionTime":"2025-10-02T09:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.710140 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.710208 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.710158 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:02 crc kubenswrapper[4722]: E1002 09:34:02.710271 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.710393 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:02 crc kubenswrapper[4722]: E1002 09:34:02.710358 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:02 crc kubenswrapper[4722]: E1002 09:34:02.710484 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:02 crc kubenswrapper[4722]: E1002 09:34:02.710531 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.764672 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.764731 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.764748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.764763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.764776 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:02Z","lastTransitionTime":"2025-10-02T09:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.866805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.866837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.866847 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.866862 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.866872 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:02Z","lastTransitionTime":"2025-10-02T09:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.969172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.969200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.969210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.969224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:02 crc kubenswrapper[4722]: I1002 09:34:02.969233 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:02Z","lastTransitionTime":"2025-10-02T09:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.072210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.072256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.072265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.072287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.072297 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:03Z","lastTransitionTime":"2025-10-02T09:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.175202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.175291 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.175304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.175340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.175350 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:03Z","lastTransitionTime":"2025-10-02T09:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.277801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.277838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.277846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.277861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.277870 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:03Z","lastTransitionTime":"2025-10-02T09:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.380415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.380450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.380459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.380474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.380484 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:03Z","lastTransitionTime":"2025-10-02T09:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.482538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.482579 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.482587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.482600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.482610 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:03Z","lastTransitionTime":"2025-10-02T09:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.584732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.584774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.584782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.584800 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.584810 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:03Z","lastTransitionTime":"2025-10-02T09:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.688563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.688664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.688685 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.688743 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.688766 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:03Z","lastTransitionTime":"2025-10-02T09:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.791644 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.791675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.791685 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.791735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.791750 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:03Z","lastTransitionTime":"2025-10-02T09:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.893772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.893827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.893842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.893858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.893867 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:03Z","lastTransitionTime":"2025-10-02T09:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.996070 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.996096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.996128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.996140 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:03 crc kubenswrapper[4722]: I1002 09:34:03.996148 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:03Z","lastTransitionTime":"2025-10-02T09:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.068478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.068712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.068771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.068818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.068841 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: E1002 09:34:04.083084 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.086558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.086589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.086600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.086615 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.086625 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: E1002 09:34:04.098083 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.101856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.101897 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.101910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.101925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.101936 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: E1002 09:34:04.115544 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.119853 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.119891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.119899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.119915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.119926 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: E1002 09:34:04.135817 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.140873 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.140907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
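The body the kubelet is trying to send is a strategic merge patch: the $setElementOrder/conditions key is the directive that tells the API server how to order the conditions list after merging, and each list entry carries only the fields that changed since the last update. A minimal sketch of building that shape by hand, with illustrative field values rather than the full payload from the log:

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Illustrative only: the same patch shape the kubelet sends, reduced
	// to one changed condition. Entries under $setElementOrder carry just
	// the merge key ("type") and fix the list order after the merge.
	patch := map[string]any{
		"status": map[string]any{
			"$setElementOrder/conditions": []map[string]string{
				{"type": "MemoryPressure"}, {"type": "DiskPressure"},
				{"type": "PIDPressure"}, {"type": "Ready"},
			},
			"conditions": []map[string]string{{
				"type":   "Ready",
				"status": "False",
				"reason": "KubeletNotReady",
			}},
		},
	}
	b, err := json.Marshal(patch)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b))
}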
event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.140915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.140929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.140938 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: E1002 09:34:04.152851 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T09:34:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"680c7bd3-d5b0-4b5c-92c1-2fb03a628b44\\\",\\\"systemUUID\\\":\\\"8b109f7e-b539-4abd-be49-fd9e033b3339\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: E1002 09:34:04.152979 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.155070 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
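The attempts above are bounded: the kubelet retries the node status patch a fixed number of times per sync pass (historically five, via its nodeStatusUpdateRetry constant) and then gives up with the "update node status exceeds retry count" error recorded just above, to try again on the next sync. A sketch of that bounded-retry pattern, with the constant's value treated as illustrative rather than authoritative:

package main

import (
	"errors"
	"fmt"
)

// Mirrors the kubelet constant's name; treat the value as illustrative.
const nodeStatusUpdateRetry = 5

func updateNodeStatus(patch func() error) error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := patch(); err == nil {
			return nil
		}
		// Each failed attempt corresponds to one
		// "Error updating node status, will retry" entry in the log.
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	err := updateNodeStatus(func() error {
		// Stand-in for the webhook's x509 verification failure.
		return errors.New("tls: failed to verify certificate")
	})
	fmt.Println(err)
}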
event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.155111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.155124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.155143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.155156 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.257520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.257571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.257583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.257602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.257616 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.360778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.360894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.360913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.360937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.360976 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.463651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.463733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.463748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.463793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.463812 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.571849 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.571914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.571929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.571962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.571985 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.676731 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.676783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.676793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.676810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.676821 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.710595 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.710697 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:04 crc kubenswrapper[4722]: E1002 09:34:04.710928 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:04 crc kubenswrapper[4722]: E1002 09:34:04.710740 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.711053 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.711199 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:04 crc kubenswrapper[4722]: E1002 09:34:04.711409 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:04 crc kubenswrapper[4722]: E1002 09:34:04.711888 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.730815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a27c33baca44df058fd006a35ff882b94f64446256b26a7bcde4c22671a5ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.749554 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.763757 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f662826c-e772-4f56-a85f-83971aaef356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b262eb3201301d5e7cc940314a920dd22bcac685c0d91f412f27647028c7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q6h76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.780418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.780525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.780544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.780583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.780610 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.780796 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tsdt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb2b1d65-2472-468b-8995-eb1943a63a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e98c56ccc33119647035cca69beb3a8def018e7bb0cae463c6910e18e0f64a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tsdt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.792875 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1756cb2-0822-4b79-b527-99dad76e6b19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4650b87cfdd4d4075b3c3c3a7417d8ae6c01222820fa783143fcf963904a9215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247e21bc25ecb430b611fbc0d7260465b80365a76340c4d89652a0e0affdc39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09276f774f0a48df9df5b99cce34366b7113c49a0c9e8e269996127d93c6015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a875abefffed9954368ae8e08d3101ae90708e53f82e7eaa447d6164e41a446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.805870 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dec5204-e2c5-43eb-8b80-80b74c9aa806\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beaa5e380cbd961e9e06c7aad0f3ff2f42c43eab5e89b6cf779003c490baf018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be9f4d5574e381c95d2f7e44f20e96db3e586f66a380f79ab89422112bbfb35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.821509 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://059f5c846149b9005d20124d666d2964bccada1c171cb57e17c64555147b0ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ee5e5f4f56ae11d8f714c42b1c1e6044b04fa5cf4c353d6bfd08c9f3b9e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T
09:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.836849 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a56e3a0-fea1-4ca5-9d2a-b0a28e16d684\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T09:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://737b3f37862e455f181b6864cab54bb6dae5c458ed9dcb9b7fda63a06d0f17fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb6cb31ce28cc0d6095e7f27187bdeb1c2ee2d3f19ac480c95ab8116bd776bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01dd91984bea279379c038d2fbb4a4ad45200b49443071e290c0bf6854af1ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df1cccd0fca2c109ddcdd7a8dbbdeda8524b30c01769c733e4ac6ffb6fde621d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee1a516ec5ce7fa70fb097f9604c192afc0e8dfb0beff7ac93b0efb006d69e02\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T09:32:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 09:32:46.368533 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 09:32:46.368793 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 09:32:46.369600 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1554788403/tls.crt::/tmp/serving-cert-1554788403/tls.key\\\\\\\"\\\\nI1002 09:32:46.682405 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 09:32:46.685049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 09:32:46.685073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 09:32:46.685132 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 09:32:46.685139 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 09:32:46.690261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 09:32:46.690286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 09:32:46.690295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 09:32:46.690299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 09:32:46.690303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 09:32:46.690306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 09:32:46.690478 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1002 09:32:46.692277 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:33:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3b1c06b43a4fb138588fe71ad3f37dae41d5de647684ab490a2f4079018ac06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T09:32:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349778e37a4ba2bdcce4b320095681560eeff1c4799d51cbfd7c666239c61182\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T09:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T09:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T09:32:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T09:34:04Z is after 2025-08-24T17:21:41Z" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.858808 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6hm66" podStartSLOduration=77.858789367 podStartE2EDuration="1m17.858789367s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:04.858088808 +0000 UTC m=+100.894841788" watchObservedRunningTime="2025-10-02 09:34:04.858789367 +0000 UTC m=+100.895542347" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.873209 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zwlns" 
podStartSLOduration=77.873175095 podStartE2EDuration="1m17.873175095s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:04.873120824 +0000 UTC m=+100.909873824" watchObservedRunningTime="2025-10-02 09:34:04.873175095 +0000 UTC m=+100.909928065" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.886012 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.886075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.886089 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.886111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.886125 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.904166 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.904146551 podStartE2EDuration="1m15.904146551s" podCreationTimestamp="2025-10-02 09:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:04.904053198 +0000 UTC m=+100.940806188" watchObservedRunningTime="2025-10-02 09:34:04.904146551 +0000 UTC m=+100.940899591" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.936185 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sjg5q" podStartSLOduration=76.936164424 podStartE2EDuration="1m16.936164424s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:04.935979559 +0000 UTC m=+100.972732539" watchObservedRunningTime="2025-10-02 09:34:04.936164424 +0000 UTC m=+100.972917394" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.989010 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cc2g9" podStartSLOduration=77.988981949 podStartE2EDuration="1m17.988981949s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:04.988028614 +0000 UTC m=+101.024781594" watchObservedRunningTime="2025-10-02 09:34:04.988981949 +0000 UTC m=+101.025734949" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.989647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:04 crc 
kubenswrapper[4722]: I1002 09:34:04.989690 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.989704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.989733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:04 crc kubenswrapper[4722]: I1002 09:34:04.989745 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:04Z","lastTransitionTime":"2025-10-02T09:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.054717 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=16.054697052 podStartE2EDuration="16.054697052s" podCreationTimestamp="2025-10-02 09:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:05.035877484 +0000 UTC m=+101.072630474" watchObservedRunningTime="2025-10-02 09:34:05.054697052 +0000 UTC m=+101.091450032" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.092989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.093043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.093052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.093072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.093086 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:05Z","lastTransitionTime":"2025-10-02T09:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.196034 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.196090 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.196107 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.196137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.196151 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:05Z","lastTransitionTime":"2025-10-02T09:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.300664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.300750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.300767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.300789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.300802 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:05Z","lastTransitionTime":"2025-10-02T09:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.402768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.402812 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.402824 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.402840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.402852 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:05Z","lastTransitionTime":"2025-10-02T09:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.505277 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.505401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.505422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.505452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.505475 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:05Z","lastTransitionTime":"2025-10-02T09:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.608078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.608145 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.608157 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.608184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.608199 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:05Z","lastTransitionTime":"2025-10-02T09:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.711438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.711495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.711507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.711530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.711542 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:05Z","lastTransitionTime":"2025-10-02T09:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.814285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.814401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.814418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.814447 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.814463 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:05Z","lastTransitionTime":"2025-10-02T09:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.918452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.918551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.918566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.918596 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:05 crc kubenswrapper[4722]: I1002 09:34:05.918617 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:05Z","lastTransitionTime":"2025-10-02T09:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.022004 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.022083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.022106 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.022141 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.022163 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:06Z","lastTransitionTime":"2025-10-02T09:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.125643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.125685 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.125693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.125707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.125716 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:06Z","lastTransitionTime":"2025-10-02T09:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.228565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.228641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.228657 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.228680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.228695 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:06Z","lastTransitionTime":"2025-10-02T09:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.252890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:06 crc kubenswrapper[4722]: E1002 09:34:06.253182 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:34:06 crc kubenswrapper[4722]: E1002 09:34:06.253349 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs podName:3d43b03f-1f3f-4824-9eda-93e405f6ee9e nodeName:}" failed. No retries permitted until 2025-10-02 09:35:10.253286418 +0000 UTC m=+166.290039588 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs") pod "network-metrics-daemon-jn9jw" (UID: "3d43b03f-1f3f-4824-9eda-93e405f6ee9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.331660 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.331720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.331732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.331756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.331770 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:06Z","lastTransitionTime":"2025-10-02T09:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.435053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.435122 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.435138 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.435162 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.435178 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:06Z","lastTransitionTime":"2025-10-02T09:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.543214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.543273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.543289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.543312 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.543390 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:06Z","lastTransitionTime":"2025-10-02T09:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.646895 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.646936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.646950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.646973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.646988 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:06Z","lastTransitionTime":"2025-10-02T09:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.709852 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.709905 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.709873 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:06 crc kubenswrapper[4722]: E1002 09:34:06.710038 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.710068 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:06 crc kubenswrapper[4722]: E1002 09:34:06.710156 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:06 crc kubenswrapper[4722]: E1002 09:34:06.710252 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:06 crc kubenswrapper[4722]: E1002 09:34:06.710299 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.750476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.750531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.750543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.750563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.750576 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:06Z","lastTransitionTime":"2025-10-02T09:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.853629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.853710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.853728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.853771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.853790 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:06Z","lastTransitionTime":"2025-10-02T09:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.964765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.964837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.964868 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.964902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:06 crc kubenswrapper[4722]: I1002 09:34:06.964928 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:06Z","lastTransitionTime":"2025-10-02T09:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.067424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.067460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.067471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.067488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.067501 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:07Z","lastTransitionTime":"2025-10-02T09:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.170179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.170214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.170240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.170258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.170267 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:07Z","lastTransitionTime":"2025-10-02T09:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.273037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.273079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.273087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.273104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.273116 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:07Z","lastTransitionTime":"2025-10-02T09:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.376663 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.376717 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.376735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.376753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.376766 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:07Z","lastTransitionTime":"2025-10-02T09:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.479710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.479765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.479778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.479797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.479812 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:07Z","lastTransitionTime":"2025-10-02T09:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.584092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.584160 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.584175 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.584197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.584214 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:07Z","lastTransitionTime":"2025-10-02T09:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.687354 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.687412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.687426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.687448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.687461 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:07Z","lastTransitionTime":"2025-10-02T09:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.790591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.790675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.790687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.790714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.790727 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:07Z","lastTransitionTime":"2025-10-02T09:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.894051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.894091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.894101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.894118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.894127 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:07Z","lastTransitionTime":"2025-10-02T09:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.997294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.997416 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.997428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.997452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:07 crc kubenswrapper[4722]: I1002 09:34:07.997465 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:07Z","lastTransitionTime":"2025-10-02T09:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.100343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.100430 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.100449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.100474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.100488 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:08Z","lastTransitionTime":"2025-10-02T09:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.203019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.203066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.203076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.203091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.203103 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:08Z","lastTransitionTime":"2025-10-02T09:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.306054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.306132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.306146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.306174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.306190 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:08Z","lastTransitionTime":"2025-10-02T09:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.408451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.408496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.408507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.408527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.408538 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:08Z","lastTransitionTime":"2025-10-02T09:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.512627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.512669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.512683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.512701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.512712 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:08Z","lastTransitionTime":"2025-10-02T09:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.615190 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.615263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.615280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.615308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.615350 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:08Z","lastTransitionTime":"2025-10-02T09:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.710132 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.710223 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.710245 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.710271 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:08 crc kubenswrapper[4722]: E1002 09:34:08.710336 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:08 crc kubenswrapper[4722]: E1002 09:34:08.710679 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:08 crc kubenswrapper[4722]: E1002 09:34:08.710735 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:08 crc kubenswrapper[4722]: E1002 09:34:08.710819 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.717172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.717222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.717232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.717247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.717261 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:08Z","lastTransitionTime":"2025-10-02T09:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.821726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.821808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.821822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.821849 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.821863 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:08Z","lastTransitionTime":"2025-10-02T09:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.924863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.924944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.924960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.924986 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:08 crc kubenswrapper[4722]: I1002 09:34:08.925003 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:08Z","lastTransitionTime":"2025-10-02T09:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.027967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.028026 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.028040 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.028063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.028078 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:09Z","lastTransitionTime":"2025-10-02T09:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.130843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.130880 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.130891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.130957 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.130971 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:09Z","lastTransitionTime":"2025-10-02T09:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.234174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.234255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.234269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.234293 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.234308 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:09Z","lastTransitionTime":"2025-10-02T09:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.338015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.338077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.338093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.338114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.338127 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:09Z","lastTransitionTime":"2025-10-02T09:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.441470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.441542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.441557 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.441585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.441601 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:09Z","lastTransitionTime":"2025-10-02T09:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.545104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.545159 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.545170 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.545187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.545196 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:09Z","lastTransitionTime":"2025-10-02T09:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.648337 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.648380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.648390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.648405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.648415 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:09Z","lastTransitionTime":"2025-10-02T09:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.751608 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.751664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.751676 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.751694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.751705 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:09Z","lastTransitionTime":"2025-10-02T09:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.854211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.854271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.854286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.854307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.854348 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:09Z","lastTransitionTime":"2025-10-02T09:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.957863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.957908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.957917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.957934 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:09 crc kubenswrapper[4722]: I1002 09:34:09.957944 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:09Z","lastTransitionTime":"2025-10-02T09:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.061541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.061598 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.061609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.061628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.061638 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:10Z","lastTransitionTime":"2025-10-02T09:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.164625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.164658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.164667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.164683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.164692 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:10Z","lastTransitionTime":"2025-10-02T09:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.266727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.266782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.266801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.266823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.266838 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:10Z","lastTransitionTime":"2025-10-02T09:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.370627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.371145 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.371291 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.371539 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.371703 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:10Z","lastTransitionTime":"2025-10-02T09:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.474640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.474687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.474699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.474717 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.474731 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:10Z","lastTransitionTime":"2025-10-02T09:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.578412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.578477 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.578492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.578512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.578525 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:10Z","lastTransitionTime":"2025-10-02T09:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.681637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.681673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.681681 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.681696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.681705 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:10Z","lastTransitionTime":"2025-10-02T09:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.709587 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.709720 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.709816 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:10 crc kubenswrapper[4722]: E1002 09:34:10.709805 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.709838 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:10 crc kubenswrapper[4722]: E1002 09:34:10.711730 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:10 crc kubenswrapper[4722]: E1002 09:34:10.712063 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:10 crc kubenswrapper[4722]: E1002 09:34:10.712294 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.784445 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.784502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.784512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.784527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.784535 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:10Z","lastTransitionTime":"2025-10-02T09:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.887828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.887885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.887896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.887913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.887923 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:10Z","lastTransitionTime":"2025-10-02T09:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.990408 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.990453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.990463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.990479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:10 crc kubenswrapper[4722]: I1002 09:34:10.990492 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:10Z","lastTransitionTime":"2025-10-02T09:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.093859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.093904 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.093915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.093934 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.093945 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:11Z","lastTransitionTime":"2025-10-02T09:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.196438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.196506 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.196521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.196544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.196558 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:11Z","lastTransitionTime":"2025-10-02T09:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.298991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.299043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.299062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.299088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.299104 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:11Z","lastTransitionTime":"2025-10-02T09:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.401265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.401340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.401376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.401399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.401411 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:11Z","lastTransitionTime":"2025-10-02T09:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.504481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.504537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.504550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.504569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.504583 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:11Z","lastTransitionTime":"2025-10-02T09:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.607103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.607149 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.607159 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.607179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.607192 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:11Z","lastTransitionTime":"2025-10-02T09:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.710568 4722 scope.go:117] "RemoveContainer" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.710819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.710886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.710910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:11 crc kubenswrapper[4722]: E1002 09:34:11.710922 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jlvpl_openshift-ovn-kubernetes(a900403a-9c51-4ca5-862f-2fbbd30cbcc3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.710978 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.711006 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:11Z","lastTransitionTime":"2025-10-02T09:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.814933 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.815003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.815033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.815069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.815095 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:11Z","lastTransitionTime":"2025-10-02T09:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.919199 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.919251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.919269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.919297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:11 crc kubenswrapper[4722]: I1002 09:34:11.919349 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:11Z","lastTransitionTime":"2025-10-02T09:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.022424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.022469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.022478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.022547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.022561 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:12Z","lastTransitionTime":"2025-10-02T09:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.125419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.125463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.125476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.125494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.125504 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:12Z","lastTransitionTime":"2025-10-02T09:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.228882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.228940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.228953 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.228971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.228986 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:12Z","lastTransitionTime":"2025-10-02T09:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.333203 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.333286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.333312 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.333396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.333427 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:12Z","lastTransitionTime":"2025-10-02T09:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.436791 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.436871 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.436903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.436942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.436967 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:12Z","lastTransitionTime":"2025-10-02T09:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.540196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.540254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.540267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.540289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.540300 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:12Z","lastTransitionTime":"2025-10-02T09:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.644006 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.644078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.644098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.644128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.644149 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:12Z","lastTransitionTime":"2025-10-02T09:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.710393 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.710448 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.710390 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:12 crc kubenswrapper[4722]: E1002 09:34:12.710640 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:12 crc kubenswrapper[4722]: E1002 09:34:12.710936 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.711498 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:12 crc kubenswrapper[4722]: E1002 09:34:12.711523 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:12 crc kubenswrapper[4722]: E1002 09:34:12.711640 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.747199 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.747258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.747277 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.747302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.747344 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:12Z","lastTransitionTime":"2025-10-02T09:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.850704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.850864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.850893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.850925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.850946 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:12Z","lastTransitionTime":"2025-10-02T09:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.954292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.954375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.954388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.954414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:12 crc kubenswrapper[4722]: I1002 09:34:12.954427 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:12Z","lastTransitionTime":"2025-10-02T09:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.057714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.057764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.057772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.057792 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.057803 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:13Z","lastTransitionTime":"2025-10-02T09:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.161416 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.161502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.161516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.161546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.161564 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:13Z","lastTransitionTime":"2025-10-02T09:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.265106 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.265187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.265210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.265240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.265259 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:13Z","lastTransitionTime":"2025-10-02T09:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.369096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.369172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.369223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.369240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.369250 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:13Z","lastTransitionTime":"2025-10-02T09:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.472708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.472776 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.472801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.472844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.472879 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:13Z","lastTransitionTime":"2025-10-02T09:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.576158 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.576244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.576266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.576300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.576366 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:13Z","lastTransitionTime":"2025-10-02T09:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.679209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.679254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.679267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.679284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.679297 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:13Z","lastTransitionTime":"2025-10-02T09:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.781959 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.782030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.782045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.782066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.782080 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:13Z","lastTransitionTime":"2025-10-02T09:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.884825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.884859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.884868 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.884882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.884891 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:13Z","lastTransitionTime":"2025-10-02T09:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.988455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.988483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.988494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.988511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:13 crc kubenswrapper[4722]: I1002 09:34:13.988524 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:13Z","lastTransitionTime":"2025-10-02T09:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.090805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.090858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.090869 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.090883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.090894 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:14Z","lastTransitionTime":"2025-10-02T09:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.193669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.193708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.193716 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.193730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.193743 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:14Z","lastTransitionTime":"2025-10-02T09:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.296810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.296887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.296900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.296925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.296943 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:14Z","lastTransitionTime":"2025-10-02T09:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.400092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.400164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.400181 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.400211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.400232 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:14Z","lastTransitionTime":"2025-10-02T09:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.488036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.488104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.488116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.488170 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.488187 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T09:34:14Z","lastTransitionTime":"2025-10-02T09:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.538269 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt"] Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.538756 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.541857 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.541886 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.541864 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.542242 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.581971 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78951a4d-9a50-415d-9ba9-bb7cb39c6918-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.582011 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78951a4d-9a50-415d-9ba9-bb7cb39c6918-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.582047 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78951a4d-9a50-415d-9ba9-bb7cb39c6918-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.582061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/78951a4d-9a50-415d-9ba9-bb7cb39c6918-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.582100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/78951a4d-9a50-415d-9ba9-bb7cb39c6918-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.589258 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tsdt2" podStartSLOduration=87.589239803 podStartE2EDuration="1m27.589239803s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:14.588872134 +0000 UTC m=+110.625625134" watchObservedRunningTime="2025-10-02 09:34:14.589239803 +0000 UTC m=+110.625992783" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.590271 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podStartSLOduration=87.590259651 podStartE2EDuration="1m27.590259651s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:14.571935207 +0000 UTC m=+110.608688197" watchObservedRunningTime="2025-10-02 09:34:14.590259651 +0000 UTC m=+110.627012631" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.607891 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=58.607871196 podStartE2EDuration="58.607871196s" podCreationTimestamp="2025-10-02 09:33:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:14.606556931 +0000 UTC m=+110.643309911" watchObservedRunningTime="2025-10-02 09:34:14.607871196 +0000 UTC m=+110.644624176" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.618073 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=44.618058291 podStartE2EDuration="44.618058291s" podCreationTimestamp="2025-10-02 09:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:14.61766233 +0000 UTC m=+110.654415310" watchObservedRunningTime="2025-10-02 09:34:14.618058291 +0000 UTC m=+110.654811271" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.665330 4722 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.665284865 podStartE2EDuration="1m28.665284865s" podCreationTimestamp="2025-10-02 09:32:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:14.664394911 +0000 UTC m=+110.701147921" watchObservedRunningTime="2025-10-02 09:34:14.665284865 +0000 UTC m=+110.702037845" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.682779 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/78951a4d-9a50-415d-9ba9-bb7cb39c6918-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.682866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78951a4d-9a50-415d-9ba9-bb7cb39c6918-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.682897 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78951a4d-9a50-415d-9ba9-bb7cb39c6918-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.682921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/78951a4d-9a50-415d-9ba9-bb7cb39c6918-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.682951 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78951a4d-9a50-415d-9ba9-bb7cb39c6918-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.683032 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/78951a4d-9a50-415d-9ba9-bb7cb39c6918-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.683166 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/78951a4d-9a50-415d-9ba9-bb7cb39c6918-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.684052 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78951a4d-9a50-415d-9ba9-bb7cb39c6918-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.699291 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78951a4d-9a50-415d-9ba9-bb7cb39c6918-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.711150 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78951a4d-9a50-415d-9ba9-bb7cb39c6918-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qmsbt\" (UID: \"78951a4d-9a50-415d-9ba9-bb7cb39c6918\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.711351 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.711474 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:14 crc kubenswrapper[4722]: E1002 09:34:14.711942 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.712059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:14 crc kubenswrapper[4722]: E1002 09:34:14.712452 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:14 crc kubenswrapper[4722]: E1002 09:34:14.712194 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.712079 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:14 crc kubenswrapper[4722]: E1002 09:34:14.712810 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:14 crc kubenswrapper[4722]: I1002 09:34:14.863502 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" Oct 02 09:34:15 crc kubenswrapper[4722]: I1002 09:34:15.287390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" event={"ID":"78951a4d-9a50-415d-9ba9-bb7cb39c6918","Type":"ContainerStarted","Data":"76ad13a03454a1d8da3e4cd4c1a7908e3475a02b74601ad4f5188853837d4f57"} Oct 02 09:34:15 crc kubenswrapper[4722]: I1002 09:34:15.287442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" event={"ID":"78951a4d-9a50-415d-9ba9-bb7cb39c6918","Type":"ContainerStarted","Data":"e175a5f8b57d31ab7d03ee500c601f6421550dd9ffb369795b34e0e66b68b8cb"} Oct 02 09:34:15 crc kubenswrapper[4722]: I1002 09:34:15.303217 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmsbt" podStartSLOduration=88.303200886 podStartE2EDuration="1m28.303200886s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:15.302171498 +0000 UTC m=+111.338924478" watchObservedRunningTime="2025-10-02 09:34:15.303200886 +0000 UTC m=+111.339953866" Oct 02 09:34:16 crc kubenswrapper[4722]: I1002 09:34:16.709769 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:16 crc kubenswrapper[4722]: I1002 09:34:16.709890 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:16 crc kubenswrapper[4722]: I1002 09:34:16.709935 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:16 crc kubenswrapper[4722]: E1002 09:34:16.710425 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:16 crc kubenswrapper[4722]: E1002 09:34:16.710195 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:16 crc kubenswrapper[4722]: E1002 09:34:16.710541 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:16 crc kubenswrapper[4722]: I1002 09:34:16.710042 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:16 crc kubenswrapper[4722]: E1002 09:34:16.710658 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:18 crc kubenswrapper[4722]: I1002 09:34:18.709799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:18 crc kubenswrapper[4722]: I1002 09:34:18.709828 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:18 crc kubenswrapper[4722]: I1002 09:34:18.709849 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:18 crc kubenswrapper[4722]: I1002 09:34:18.709982 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:18 crc kubenswrapper[4722]: E1002 09:34:18.710141 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:18 crc kubenswrapper[4722]: E1002 09:34:18.710227 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:18 crc kubenswrapper[4722]: E1002 09:34:18.710454 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:18 crc kubenswrapper[4722]: E1002 09:34:18.710726 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:20 crc kubenswrapper[4722]: I1002 09:34:20.710433 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:20 crc kubenswrapper[4722]: I1002 09:34:20.710434 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:20 crc kubenswrapper[4722]: I1002 09:34:20.710575 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:20 crc kubenswrapper[4722]: I1002 09:34:20.710608 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:20 crc kubenswrapper[4722]: E1002 09:34:20.710823 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:20 crc kubenswrapper[4722]: E1002 09:34:20.710885 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:20 crc kubenswrapper[4722]: E1002 09:34:20.710972 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:20 crc kubenswrapper[4722]: E1002 09:34:20.711363 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:22 crc kubenswrapper[4722]: I1002 09:34:22.709869 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:22 crc kubenswrapper[4722]: E1002 09:34:22.709999 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:22 crc kubenswrapper[4722]: I1002 09:34:22.710136 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:22 crc kubenswrapper[4722]: I1002 09:34:22.710168 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:22 crc kubenswrapper[4722]: I1002 09:34:22.710557 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:22 crc kubenswrapper[4722]: E1002 09:34:22.710549 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:22 crc kubenswrapper[4722]: E1002 09:34:22.710640 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:22 crc kubenswrapper[4722]: E1002 09:34:22.710744 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.318285 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/1.log" Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.319047 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/0.log" Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.319162 4722 generic.go:334] "Generic (PLEG): container finished" podID="04c84986-75a1-4683-8764-8f7307d1126a" containerID="aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a" exitCode=1 Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.319222 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zwlns" event={"ID":"04c84986-75a1-4683-8764-8f7307d1126a","Type":"ContainerDied","Data":"aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a"} Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.319346 4722 scope.go:117] "RemoveContainer" containerID="460204a345893172e65dd3415be30a96dd48d7bce44d05971133cde3e43939af" Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.319934 4722 scope.go:117] "RemoveContainer" containerID="aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a" Oct 02 09:34:24 crc kubenswrapper[4722]: E1002 09:34:24.320217 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zwlns_openshift-multus(04c84986-75a1-4683-8764-8f7307d1126a)\"" pod="openshift-multus/multus-zwlns" podUID="04c84986-75a1-4683-8764-8f7307d1126a" Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.711752 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.711775 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.711793 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.711803 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:24 crc kubenswrapper[4722]: I1002 09:34:24.712556 4722 scope.go:117] "RemoveContainer" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:34:24 crc kubenswrapper[4722]: E1002 09:34:24.712756 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 09:34:24 crc kubenswrapper[4722]: E1002 09:34:24.712884 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 09:34:24 crc kubenswrapper[4722]: E1002 09:34:24.712964 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e" Oct 02 09:34:24 crc kubenswrapper[4722]: E1002 09:34:24.713024 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 09:34:24 crc kubenswrapper[4722]: E1002 09:34:24.730166 4722 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 02 09:34:24 crc kubenswrapper[4722]: E1002 09:34:24.796575 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 02 09:34:25 crc kubenswrapper[4722]: I1002 09:34:25.326214 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/3.log"
Oct 02 09:34:25 crc kubenswrapper[4722]: I1002 09:34:25.329235 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerStarted","Data":"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204"}
Oct 02 09:34:25 crc kubenswrapper[4722]: I1002 09:34:25.329642 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl"
Oct 02 09:34:25 crc kubenswrapper[4722]: I1002 09:34:25.330741 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/1.log"
Oct 02 09:34:25 crc kubenswrapper[4722]: I1002 09:34:25.362482 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podStartSLOduration=98.362457761 podStartE2EDuration="1m38.362457761s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:25.360290812 +0000 UTC m=+121.397043812" watchObservedRunningTime="2025-10-02 09:34:25.362457761 +0000 UTC m=+121.399210741"
Oct 02 09:34:25 crc kubenswrapper[4722]: I1002 09:34:25.774626 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jn9jw"]
Oct 02 09:34:25 crc kubenswrapper[4722]: I1002 09:34:25.774822 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw"
Oct 02 09:34:25 crc kubenswrapper[4722]: E1002 09:34:25.774961 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e"
Oct 02 09:34:26 crc kubenswrapper[4722]: I1002 09:34:26.709534 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:34:26 crc kubenswrapper[4722]: I1002 09:34:26.709595 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 09:34:26 crc kubenswrapper[4722]: I1002 09:34:26.709563 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 09:34:26 crc kubenswrapper[4722]: E1002 09:34:26.709813 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 09:34:26 crc kubenswrapper[4722]: E1002 09:34:26.709895 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 09:34:26 crc kubenswrapper[4722]: E1002 09:34:26.709988 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 09:34:27 crc kubenswrapper[4722]: I1002 09:34:27.710175 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw"
Oct 02 09:34:27 crc kubenswrapper[4722]: E1002 09:34:27.710657 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e"
Oct 02 09:34:28 crc kubenswrapper[4722]: I1002 09:34:28.709867 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 09:34:28 crc kubenswrapper[4722]: I1002 09:34:28.709934 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 09:34:28 crc kubenswrapper[4722]: I1002 09:34:28.709997 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:34:28 crc kubenswrapper[4722]: E1002 09:34:28.710123 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 09:34:28 crc kubenswrapper[4722]: E1002 09:34:28.710245 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 09:34:28 crc kubenswrapper[4722]: E1002 09:34:28.710475 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 09:34:29 crc kubenswrapper[4722]: I1002 09:34:29.709661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw"
Oct 02 09:34:29 crc kubenswrapper[4722]: E1002 09:34:29.709858 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e"
Oct 02 09:34:29 crc kubenswrapper[4722]: E1002 09:34:29.798630 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 02 09:34:30 crc kubenswrapper[4722]: I1002 09:34:30.709566 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:34:30 crc kubenswrapper[4722]: I1002 09:34:30.709581 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 09:34:30 crc kubenswrapper[4722]: E1002 09:34:30.710081 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 09:34:30 crc kubenswrapper[4722]: I1002 09:34:30.709695 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 09:34:30 crc kubenswrapper[4722]: E1002 09:34:30.710295 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 09:34:30 crc kubenswrapper[4722]: E1002 09:34:30.710438 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 09:34:31 crc kubenswrapper[4722]: I1002 09:34:31.709568 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw"
Oct 02 09:34:31 crc kubenswrapper[4722]: E1002 09:34:31.709825 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e"
Oct 02 09:34:32 crc kubenswrapper[4722]: I1002 09:34:32.709830 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:34:32 crc kubenswrapper[4722]: I1002 09:34:32.709830 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 09:34:32 crc kubenswrapper[4722]: E1002 09:34:32.710012 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 09:34:32 crc kubenswrapper[4722]: I1002 09:34:32.709962 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 09:34:32 crc kubenswrapper[4722]: E1002 09:34:32.710162 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 09:34:32 crc kubenswrapper[4722]: E1002 09:34:32.710254 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 09:34:33 crc kubenswrapper[4722]: I1002 09:34:33.709222 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw"
Oct 02 09:34:33 crc kubenswrapper[4722]: E1002 09:34:33.709396 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e"
Oct 02 09:34:34 crc kubenswrapper[4722]: I1002 09:34:34.709587 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:34:34 crc kubenswrapper[4722]: I1002 09:34:34.709576 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 09:34:34 crc kubenswrapper[4722]: I1002 09:34:34.709605 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 09:34:34 crc kubenswrapper[4722]: E1002 09:34:34.710940 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 09:34:34 crc kubenswrapper[4722]: E1002 09:34:34.710737 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 09:34:34 crc kubenswrapper[4722]: E1002 09:34:34.711421 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 09:34:34 crc kubenswrapper[4722]: I1002 09:34:34.711884 4722 scope.go:117] "RemoveContainer" containerID="aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a"
Oct 02 09:34:34 crc kubenswrapper[4722]: E1002 09:34:34.799682 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 02 09:34:35 crc kubenswrapper[4722]: I1002 09:34:35.377075 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/1.log"
Oct 02 09:34:35 crc kubenswrapper[4722]: I1002 09:34:35.377134 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zwlns" event={"ID":"04c84986-75a1-4683-8764-8f7307d1126a","Type":"ContainerStarted","Data":"3b7bf9f2844675ba4d9daf32117df4cf7541f82b5f6c4e42c2b03ddfb891ecda"}
Oct 02 09:34:35 crc kubenswrapper[4722]: I1002 09:34:35.710173 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw"
Oct 02 09:34:35 crc kubenswrapper[4722]: E1002 09:34:35.710335 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e"
Oct 02 09:34:36 crc kubenswrapper[4722]: I1002 09:34:36.709684 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 09:34:36 crc kubenswrapper[4722]: I1002 09:34:36.709716 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:34:36 crc kubenswrapper[4722]: I1002 09:34:36.709764 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 09:34:36 crc kubenswrapper[4722]: E1002 09:34:36.709842 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 09:34:36 crc kubenswrapper[4722]: E1002 09:34:36.710013 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 09:34:36 crc kubenswrapper[4722]: E1002 09:34:36.710332 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 09:34:37 crc kubenswrapper[4722]: I1002 09:34:37.709775 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw"
Oct 02 09:34:37 crc kubenswrapper[4722]: E1002 09:34:37.709918 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e"
Oct 02 09:34:38 crc kubenswrapper[4722]: I1002 09:34:38.709972 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 09:34:38 crc kubenswrapper[4722]: I1002 09:34:38.710042 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:34:38 crc kubenswrapper[4722]: E1002 09:34:38.710105 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 09:34:38 crc kubenswrapper[4722]: E1002 09:34:38.710197 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 09:34:38 crc kubenswrapper[4722]: I1002 09:34:38.709981 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 09:34:38 crc kubenswrapper[4722]: E1002 09:34:38.710271 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 09:34:39 crc kubenswrapper[4722]: I1002 09:34:39.709795 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw"
Oct 02 09:34:39 crc kubenswrapper[4722]: E1002 09:34:39.710217 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9jw" podUID="3d43b03f-1f3f-4824-9eda-93e405f6ee9e"
Oct 02 09:34:40 crc kubenswrapper[4722]: I1002 09:34:40.709704 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 09:34:40 crc kubenswrapper[4722]: I1002 09:34:40.709761 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 09:34:40 crc kubenswrapper[4722]: I1002 09:34:40.710370 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 09:34:40 crc kubenswrapper[4722]: I1002 09:34:40.711670 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Oct 02 09:34:40 crc kubenswrapper[4722]: I1002 09:34:40.711869 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Oct 02 09:34:40 crc kubenswrapper[4722]: I1002 09:34:40.712181 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Oct 02 09:34:40 crc kubenswrapper[4722]: I1002 09:34:40.712252 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Oct 02 09:34:41 crc kubenswrapper[4722]: I1002 09:34:41.709239 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw"
Oct 02 09:34:41 crc kubenswrapper[4722]: I1002 09:34:41.711805 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Oct 02 09:34:41 crc kubenswrapper[4722]: I1002 09:34:41.711886 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Oct 02 09:34:44 crc kubenswrapper[4722]: I1002 09:34:44.628034 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.011586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.043716 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.044426 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.045894 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jkt7j"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.046598 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.049360 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.050305 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.052567 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.052571 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.053811 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.054296 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.055490 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.055677 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.055972 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.056303 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.056630 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.057288 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.057752 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.057766 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.057848 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.058113 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.065205 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.065287 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.065456 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.066680 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.066804 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.068235 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.068567 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.069307 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.069398 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk6t7"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.069418 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.069477 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.069882 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.070250 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.070258 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.072866 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-s727n"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.073513 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-s727n"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.078245 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.078634 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.078908 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.079040 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.079250 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.079367 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-krj4j"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.079949 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.081138 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.081657 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.084662 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.084862 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.085061 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.085494 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-clnnl"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.085568 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.085928 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ljxl7"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.086218 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.086308 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-clnnl"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.088137 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dgwxp"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.089023 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.089681 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.090272 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.090535 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mfhl4"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.090982 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mfhl4"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.096894 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.097151 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.097333 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.097626 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.099213 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.099482 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.099734 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.099919 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.100040 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.100083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.100160 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.100174 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.100354 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.100937 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.103478 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6xfhq"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.104022 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6xfhq"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.106061 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.106234 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.107844 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kf5f5"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.108763 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.112523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.112938 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.113330 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.113489 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.113622 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.113755 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.113993 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.114438 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.117256 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.117560 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.118108 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.118224 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.118407 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.118430 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.118566 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.118724 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.118803 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.118925 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.119167 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.119335 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.119475 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.119585 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.119762 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.120387 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.120518 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.119601 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.129365 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.118808 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.130340 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.130650 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.130789 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.136954 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.140515 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-954gs"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.142072 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.142793 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.142890 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.152521 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.152706 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.152800 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.152972 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.153159 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.156119 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.163957 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.164167 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.164405 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.153164 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.164556 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.164829 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.165119 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.167968 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.168367 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.168466 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.168823 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.168977 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.169726 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xpzp4"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.170196 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.170435 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.170778 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.171216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxrnk\" (UniqueName: \"kubernetes.io/projected/75d348d9-43ce-41be-afdf-1a35627522c8-kube-api-access-mxrnk\") pod \"openshift-apiserver-operator-796bbdcf4f-b9q7t\" (UID: \"75d348d9-43ce-41be-afdf-1a35627522c8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.171344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ab29eb0-bca1-422c-a796-6c220bb76022-serving-cert\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.171449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g2vg\" (UniqueName: \"kubernetes.io/projected/75b01f76-263d-4523-87e1-eb33d01164a7-kube-api-access-4g2vg\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.171528 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fghnh\" (UniqueName: \"kubernetes.io/projected/4104d0d5-52a2-47da-9cd0-86d4c52bc122-kube-api-access-fghnh\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.171614 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ab29eb0-bca1-422c-a796-6c220bb76022-config\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.171733 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-client-ca\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.171827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75b01f76-263d-4523-87e1-eb33d01164a7-auth-proxy-config\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.171907 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b01f76-263d-4523-87e1-eb33d01164a7-config\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.171984 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-config\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.172077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ab29eb0-bca1-422c-a796-6c220bb76022-trusted-ca\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.172154 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5zpt\" (UniqueName: \"kubernetes.io/projected/5ab29eb0-bca1-422c-a796-6c220bb76022-kube-api-access-q5zpt\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.172242 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4104d0d5-52a2-47da-9cd0-86d4c52bc122-serving-cert\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.172389 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e5de8035-7427-4646-ad95-13c58ecad616-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r8rd5\" (UID: \"e5de8035-7427-4646-ad95-13c58ecad616\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.172492 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75b01f76-263d-4523-87e1-eb33d01164a7-machine-approver-tls\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.172569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d348d9-43ce-41be-afdf-1a35627522c8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b9q7t\" (UID: \"75d348d9-43ce-41be-afdf-1a35627522c8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.172665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6dvt\" (UniqueName: \"kubernetes.io/projected/e5de8035-7427-4646-ad95-13c58ecad616-kube-api-access-f6dvt\") pod \"openshift-config-operator-7777fb866f-r8rd5\" (UID: \"e5de8035-7427-4646-ad95-13c58ecad616\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.172754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d348d9-43ce-41be-afdf-1a35627522c8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b9q7t\" (UID: \"75d348d9-43ce-41be-afdf-1a35627522c8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.172826 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5de8035-7427-4646-ad95-13c58ecad616-serving-cert\") pod \"openshift-config-operator-7777fb866f-r8rd5\" (UID: \"e5de8035-7427-4646-ad95-13c58ecad616\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.174981 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.175842 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.177790 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.178412 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.180731 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.182992 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.183270 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.183779 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.187370 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.188062 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.188331 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.188771 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.188844 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.188901 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.188925 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.189625 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.190051 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.190122 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xkx2c"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.190248 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.203777 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.206925 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jkt7j"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.206976 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.208638 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.209171 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.212205 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.213074 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-87568"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.217808 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.218230 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-87568"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.218519 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.222698 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtsxk"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.223537 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.225560 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c5szj"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.226716 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.227039 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.227555 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.228171 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.230598 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.233743 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.234587 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-clnnl"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.236896 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7ghd2"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.238424 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.238632 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.239955 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.240200 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.241744 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.241934 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.244891 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.245861 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.246811 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.250969 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h5twj"]
Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.252233 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.252895 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk6t7"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.254830 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ljxl7"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.255754 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-krj4j"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.256860 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.258153 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.258958 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.259964 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kf5f5"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.260978 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h645w"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.262962 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-s727n"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.263078 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.263128 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.264389 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.265786 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dgwxp"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.266752 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.268299 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mfhl4"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.269955 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-954gs"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.271452 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xpzp4"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.272688 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273566 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g2vg\" (UniqueName: \"kubernetes.io/projected/75b01f76-263d-4523-87e1-eb33d01164a7-kube-api-access-4g2vg\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273614 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fghnh\" (UniqueName: \"kubernetes.io/projected/4104d0d5-52a2-47da-9cd0-86d4c52bc122-kube-api-access-fghnh\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ab29eb0-bca1-422c-a796-6c220bb76022-config\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273673 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75b01f76-263d-4523-87e1-eb33d01164a7-auth-proxy-config\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-client-ca\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b01f76-263d-4523-87e1-eb33d01164a7-config\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273761 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-config\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273792 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5zpt\" (UniqueName: \"kubernetes.io/projected/5ab29eb0-bca1-422c-a796-6c220bb76022-kube-api-access-q5zpt\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ab29eb0-bca1-422c-a796-6c220bb76022-trusted-ca\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273846 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4104d0d5-52a2-47da-9cd0-86d4c52bc122-serving-cert\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e5de8035-7427-4646-ad95-13c58ecad616-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r8rd5\" (UID: \"e5de8035-7427-4646-ad95-13c58ecad616\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273917 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75b01f76-263d-4523-87e1-eb33d01164a7-machine-approver-tls\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273948 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d348d9-43ce-41be-afdf-1a35627522c8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b9q7t\" (UID: 
\"75d348d9-43ce-41be-afdf-1a35627522c8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.273979 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6dvt\" (UniqueName: \"kubernetes.io/projected/e5de8035-7427-4646-ad95-13c58ecad616-kube-api-access-f6dvt\") pod \"openshift-config-operator-7777fb866f-r8rd5\" (UID: \"e5de8035-7427-4646-ad95-13c58ecad616\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.274020 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d348d9-43ce-41be-afdf-1a35627522c8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b9q7t\" (UID: \"75d348d9-43ce-41be-afdf-1a35627522c8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.274072 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5de8035-7427-4646-ad95-13c58ecad616-serving-cert\") pod \"openshift-config-operator-7777fb866f-r8rd5\" (UID: \"e5de8035-7427-4646-ad95-13c58ecad616\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.274121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ab29eb0-bca1-422c-a796-6c220bb76022-serving-cert\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.274146 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxrnk\" (UniqueName: \"kubernetes.io/projected/75d348d9-43ce-41be-afdf-1a35627522c8-kube-api-access-mxrnk\") pod \"openshift-apiserver-operator-796bbdcf4f-b9q7t\" (UID: \"75d348d9-43ce-41be-afdf-1a35627522c8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.275174 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b01f76-263d-4523-87e1-eb33d01164a7-config\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.276353 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ab29eb0-bca1-422c-a796-6c220bb76022-config\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.276426 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d348d9-43ce-41be-afdf-1a35627522c8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b9q7t\" (UID: \"75d348d9-43ce-41be-afdf-1a35627522c8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 
09:34:45.276519 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xkx2c"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.276551 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-87568"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.276792 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75b01f76-263d-4523-87e1-eb33d01164a7-auth-proxy-config\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.277831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-client-ca\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.277976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e5de8035-7427-4646-ad95-13c58ecad616-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r8rd5\" (UID: \"e5de8035-7427-4646-ad95-13c58ecad616\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.278907 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.280000 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c5szj"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.282750 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.282833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ab29eb0-bca1-422c-a796-6c220bb76022-trusted-ca\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.283923 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtsxk"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.286533 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.286604 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.287227 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ab29eb0-bca1-422c-a796-6c220bb76022-serving-cert\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " 
pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.288722 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.289723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d348d9-43ce-41be-afdf-1a35627522c8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b9q7t\" (UID: \"75d348d9-43ce-41be-afdf-1a35627522c8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.290161 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.291623 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7ghd2"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.292124 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4104d0d5-52a2-47da-9cd0-86d4c52bc122-serving-cert\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.292211 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.292488 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75b01f76-263d-4523-87e1-eb33d01164a7-machine-approver-tls\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.292876 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5de8035-7427-4646-ad95-13c58ecad616-serving-cert\") pod \"openshift-config-operator-7777fb866f-r8rd5\" (UID: \"e5de8035-7427-4646-ad95-13c58ecad616\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.293807 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.294358 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-config\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.296512 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.297356 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-l7mvw"] Oct 02 09:34:45 crc 
kubenswrapper[4722]: I1002 09:34:45.300887 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.300929 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8fqr9"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.301182 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.304852 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8fqr9"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.305026 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8fqr9" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.305389 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h645w"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.308076 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h5twj"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.322381 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.328221 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hwbdb"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.329896 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.330340 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.337906 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hwbdb"] Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.346254 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.366938 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.386976 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.414662 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.427120 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.446909 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.466528 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.492578 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.506592 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.527420 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.546357 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.566641 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.586775 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.606714 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.626535 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.646703 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.667047 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.687052 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.707050 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.727274 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.747120 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.766599 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.786926 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.807455 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.827083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.847477 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.873514 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.887468 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.907609 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.926593 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.954436 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.966746 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 02 09:34:45 crc kubenswrapper[4722]: I1002 09:34:45.987214 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.008009 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.027001 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.047789 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.067904 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.087698 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.113622 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.127430 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.147785 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.166909 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.186646 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 02 
09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.207950 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.225661 4722 request.go:700] Waited for 1.007027091s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-sa-dockercfg-5xfcg&limit=500&resourceVersion=0 Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.227294 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.247797 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.267422 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.287831 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.306720 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.328338 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.348847 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.366922 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.387863 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.406756 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.437746 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.447916 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.467508 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.487124 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.508583 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.527726 4722 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.548672 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.567933 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.587537 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.611119 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.628296 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.647871 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.721795 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.722461 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.722549 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.725715 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.748045 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.767086 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.787962 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.807647 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.827274 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.848613 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.866722 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.886624 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 02 09:34:46 crc 
kubenswrapper[4722]: I1002 09:34:46.929399 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxrnk\" (UniqueName: \"kubernetes.io/projected/75d348d9-43ce-41be-afdf-1a35627522c8-kube-api-access-mxrnk\") pod \"openshift-apiserver-operator-796bbdcf4f-b9q7t\" (UID: \"75d348d9-43ce-41be-afdf-1a35627522c8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.949267 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fghnh\" (UniqueName: \"kubernetes.io/projected/4104d0d5-52a2-47da-9cd0-86d4c52bc122-kube-api-access-fghnh\") pod \"route-controller-manager-6576b87f9c-pf6dm\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.964161 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g2vg\" (UniqueName: \"kubernetes.io/projected/75b01f76-263d-4523-87e1-eb33d01164a7-kube-api-access-4g2vg\") pod \"machine-approver-56656f9798-flf5c\" (UID: \"75b01f76-263d-4523-87e1-eb33d01164a7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" Oct 02 09:34:46 crc kubenswrapper[4722]: I1002 09:34:46.985448 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5zpt\" (UniqueName: \"kubernetes.io/projected/5ab29eb0-bca1-422c-a796-6c220bb76022-kube-api-access-q5zpt\") pod \"console-operator-58897d9998-s727n\" (UID: \"5ab29eb0-bca1-422c-a796-6c220bb76022\") " pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.012090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6dvt\" (UniqueName: \"kubernetes.io/projected/e5de8035-7427-4646-ad95-13c58ecad616-kube-api-access-f6dvt\") pod \"openshift-config-operator-7777fb866f-r8rd5\" (UID: \"e5de8035-7427-4646-ad95-13c58ecad616\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.012613 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpzbq\" (UniqueName: \"kubernetes.io/projected/77902092-381a-451e-b462-de97532929fe-kube-api-access-fpzbq\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022515 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjqnv\" (UniqueName: \"kubernetes.io/projected/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-kube-api-access-qjqnv\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022535 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4de37de4-2c3b-4742-96fe-e63227dfc04b-metrics-tls\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022563 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-console-config\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022582 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-certificates\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022630 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-image-import-ca\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022649 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722a2a42-6f33-4002-ab72-3544df821d3b-serving-cert\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc 
kubenswrapper[4722]: I1002 09:34:47.022718 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022764 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db8e859-4683-4df7-87e2-b84a8113cb35-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022790 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db8e859-4683-4df7-87e2-b84a8113cb35-service-ca-bundle\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022809 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/160f98c9-f6b0-4d80-ab97-37487b03a785-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn88h\" (UniqueName: \"kubernetes.io/projected/722a2a42-6f33-4002-ab72-3544df821d3b-kube-api-access-hn88h\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022846 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4de37de4-2c3b-4742-96fe-e63227dfc04b-trusted-ca\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022901 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-audit\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022952 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/722a2a42-6f33-4002-ab72-3544df821d3b-audit-dir\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.022983 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b92581b-78c7-41ba-adac-0e8336d81508-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: \"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023015 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbjf4\" (UniqueName: \"kubernetes.io/projected/6898b032-cd5c-42cf-8ebf-df3fec127c5e-kube-api-access-sbjf4\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023040 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023127 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77902092-381a-451e-b462-de97532929fe-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023158 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-policies\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023185 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2q9\" (UniqueName: \"kubernetes.io/projected/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-kube-api-access-4t2q9\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 
09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023256 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9db8e859-4683-4df7-87e2-b84a8113cb35-config\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023282 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-metrics-certs\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023580 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77902092-381a-451e-b462-de97532929fe-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023615 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebdfe0bc-aee2-4b9e-bea3-e226541a08ea-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z5zmm\" (UID: \"ebdfe0bc-aee2-4b9e-bea3-e226541a08ea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023635 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6898b032-cd5c-42cf-8ebf-df3fec127c5e-console-serving-cert\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023670 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023689 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023761 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc 
kubenswrapper[4722]: I1002 09:34:47.023868 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdf4\" (UniqueName: \"kubernetes.io/projected/9db8e859-4683-4df7-87e2-b84a8113cb35-kube-api-access-9zdf4\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2gv4\" (UniqueName: \"kubernetes.io/projected/e6070617-4dba-40fb-a6a8-03f6807a6846-kube-api-access-g2gv4\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023932 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-bound-sa-token\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023952 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-config\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.023984 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77902092-381a-451e-b462-de97532929fe-encryption-config\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024020 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024306 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-trusted-ca\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6898b032-cd5c-42cf-8ebf-df3fec127c5e-console-oauth-config\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024398 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-trusted-ca-bundle\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6070617-4dba-40fb-a6a8-03f6807a6846-serving-cert\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024469 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/722a2a42-6f33-4002-ab72-3544df821d3b-encryption-config\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024489 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9db8e859-4683-4df7-87e2-b84a8113cb35-serving-cert\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024511 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xlgs\" (UniqueName: \"kubernetes.io/projected/7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3-kube-api-access-9xlgs\") pod \"downloads-7954f5f757-mfhl4\" (UID: \"7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3\") " pod="openshift-console/downloads-7954f5f757-mfhl4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77902092-381a-451e-b462-de97532929fe-serving-cert\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024600 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-stats-auth\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b92581b-78c7-41ba-adac-0e8336d81508-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: \"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.024981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-dir\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025016 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/77902092-381a-451e-b462-de97532929fe-audit-policies\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-942gl\" (UniqueName: \"kubernetes.io/projected/9b92581b-78c7-41ba-adac-0e8336d81508-kube-api-access-942gl\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: \"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025142 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94f8n\" (UniqueName: \"kubernetes.io/projected/4de37de4-2c3b-4742-96fe-e63227dfc04b-kube-api-access-94f8n\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw9x6\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-kube-api-access-lw9x6\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/722a2a42-6f33-4002-ab72-3544df821d3b-node-pullsecrets\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/722a2a42-6f33-4002-ab72-3544df821d3b-etcd-client\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025327 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/160f98c9-f6b0-4d80-ab97-37487b03a785-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.025371 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 09:34:47.525341861 +0000 UTC m=+143.562095011 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025468 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-config\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025523 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbjfd\" (UniqueName: \"kubernetes.io/projected/ebdfe0bc-aee2-4b9e-bea3-e226541a08ea-kube-api-access-tbjfd\") pod \"cluster-samples-operator-665b6dd947-z5zmm\" (UID: \"ebdfe0bc-aee2-4b9e-bea3-e226541a08ea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-config\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025600 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4de37de4-2c3b-4742-96fe-e63227dfc04b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025684 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-oauth-serving-cert\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025710 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025734 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-tls\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025783 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-images\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-default-certificate\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025855 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-etcd-serving-ca\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-service-ca-bundle\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.025917 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.026017 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-client-ca\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: 
I1002 09:34:47.026115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.026163 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmcrw\" (UniqueName: \"kubernetes.io/projected/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-kube-api-access-tmcrw\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.026189 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.026216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77902092-381a-451e-b462-de97532929fe-etcd-client\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.026234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77902092-381a-451e-b462-de97532929fe-audit-dir\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.026254 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-service-ca\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.026285 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b92581b-78c7-41ba-adac-0e8336d81508-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: \"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.026916 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.027093 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.047427 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.059175 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.068002 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.088437 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.108074 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.109290 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127010 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-trusted-ca\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127558 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6898b032-cd5c-42cf-8ebf-df3fec127c5e-console-oauth-config\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127576 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-trusted-ca-bundle\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb13bcef-7d21-460c-851d-96eeb995f612-metrics-tls\") pod \"dns-default-hwbdb\" (UID: \"cb13bcef-7d21-460c-851d-96eeb995f612\") " pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127618 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22202c1d-bb44-4648-9e5b-d8a29a94003f-signing-key\") pod \"service-ca-9c57cc56f-h5twj\" (UID: \"22202c1d-bb44-4648-9e5b-d8a29a94003f\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127639 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b92581b-78c7-41ba-adac-0e8336d81508-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: \"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7ghd2\" (UID: \"bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127688 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-dir\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127707 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-socket-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127725 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8kvn\" (UniqueName: \"kubernetes.io/projected/9b22e0a2-1fea-4571-ad2b-e639bed4832c-kube-api-access-g8kvn\") pod \"machine-config-server-l7mvw\" (UID: \"9b22e0a2-1fea-4571-ad2b-e639bed4832c\") " pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127746 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk8bv\" (UniqueName: \"kubernetes.io/projected/d30e5a09-5492-426b-89a9-2331cce3cc87-kube-api-access-mk8bv\") pod \"collect-profiles-29323290-fh7nk\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94f8n\" (UniqueName: \"kubernetes.io/projected/4de37de4-2c3b-4742-96fe-e63227dfc04b-kube-api-access-94f8n\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-csi-data-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2a98cf7d-1ef8-421e-85be-1bc5310adfdc-profile-collector-cert\") pod \"catalog-operator-68c6474976-5bhmb\" (UID: \"2a98cf7d-1ef8-421e-85be-1bc5310adfdc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127818 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bnd6\" (UniqueName: \"kubernetes.io/projected/ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b-kube-api-access-2bnd6\") pod \"machine-config-controller-84d6567774-42lsk\" (UID: \"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127833 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28604003-081b-4192-af14-bab8aab39d15-serving-cert\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2537cd59-b6d6-46da-a99f-3f6306a19fdb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kbnrw\" (UID: \"2537cd59-b6d6-46da-a99f-3f6306a19fdb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127871 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/160f98c9-f6b0-4d80-ab97-37487b03a785-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127887 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxhrx\" (UniqueName: \"kubernetes.io/projected/2186d01d-5c96-4c50-9414-bce7cd2d7052-kube-api-access-zxhrx\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127906 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-config\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127923 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-tmpfs\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127940 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-oauth-serving-cert\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-plugins-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-tls\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.127990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-images\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128008 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk2t2\" (UniqueName: \"kubernetes.io/projected/31485bfd-8f43-4046-a188-5b69955b5566-kube-api-access-qk2t2\") pod \"olm-operator-6b444d44fb-422gm\" (UID: \"31485bfd-8f43-4046-a188-5b69955b5566\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128028 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9fbcec9-73cc-4d46-88d5-3596af15db5b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vftjt\" (UID: \"a9fbcec9-73cc-4d46-88d5-3596af15db5b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128047 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-service-ca-bundle\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128063 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-client-ca\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128079 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf859e5d-c395-4892-9abf-e1a9a9d21176-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpfrs\" (UID: \"bf859e5d-c395-4892-9abf-e1a9a9d21176\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128117 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qtsxk\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92d2f480-8dcd-483c-ba02-41a2027b12ae-serving-cert\") pod \"service-ca-operator-777779d784-c5szj\" (UID: \"92d2f480-8dcd-483c-ba02-41a2027b12ae\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128149 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fcdab41-6e2e-46fe-b9d0-1411949a51e9-metrics-tls\") pod \"dns-operator-744455d44c-xpzp4\" (UID: \"5fcdab41-6e2e-46fe-b9d0-1411949a51e9\") " pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128165 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d30e5a09-5492-426b-89a9-2331cce3cc87-secret-volume\") pod \"collect-profiles-29323290-fh7nk\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v92ss\" (UniqueName: \"kubernetes.io/projected/28604003-081b-4192-af14-bab8aab39d15-kube-api-access-v92ss\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128196 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-42lsk\" (UID: \"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-service-ca\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128229 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128245 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77902092-381a-451e-b462-de97532929fe-etcd-client\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128282 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77902092-381a-451e-b462-de97532929fe-audit-dir\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128299 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpzbq\" (UniqueName: \"kubernetes.io/projected/77902092-381a-451e-b462-de97532929fe-kube-api-access-fpzbq\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.128346 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:47.628310072 +0000 UTC m=+143.665063052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128395 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-console-config\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128418 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-certificates\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128446 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-image-import-ca\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722a2a42-6f33-4002-ab72-3544df821d3b-serving-cert\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128486 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wns8\" (UniqueName: \"kubernetes.io/projected/69b26642-0e96-47a5-9154-9866670cb9e4-kube-api-access-7wns8\") pod \"migrator-59844c95c7-87568\" (UID: \"69b26642-0e96-47a5-9154-9866670cb9e4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-87568" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128525 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db8e859-4683-4df7-87e2-b84a8113cb35-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128544 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9db8e859-4683-4df7-87e2-b84a8113cb35-service-ca-bundle\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128561 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-registration-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128580 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30e5a09-5492-426b-89a9-2331cce3cc87-config-volume\") pod \"collect-profiles-29323290-fh7nk\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128608 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d79d2f7-de31-4ae6-b630-762dd6a5bef3-config\") pod \"kube-apiserver-operator-766d6c64bb-45mp5\" (UID: \"4d79d2f7-de31-4ae6-b630-762dd6a5bef3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128624 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6kxc\" (UniqueName: \"kubernetes.io/projected/5fcdab41-6e2e-46fe-b9d0-1411949a51e9-kube-api-access-k6kxc\") pod \"dns-operator-744455d44c-xpzp4\" (UID: \"5fcdab41-6e2e-46fe-b9d0-1411949a51e9\") " pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128644 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84cae530-6db3-4e7e-bf5f-b4e25ce9d992-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nvrnd\" (UID: \"84cae530-6db3-4e7e-bf5f-b4e25ce9d992\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128664 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d79d2f7-de31-4ae6-b630-762dd6a5bef3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-45mp5\" (UID: \"4d79d2f7-de31-4ae6-b630-762dd6a5bef3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128689 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b92581b-78c7-41ba-adac-0e8336d81508-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: \"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128705 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128712 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9365ecd-353b-4623-a174-45d558e9d16d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4656\" (UID: \"f9365ecd-353b-4623-a174-45d558e9d16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcxxs\" (UniqueName: \"kubernetes.io/projected/2a98cf7d-1ef8-421e-85be-1bc5310adfdc-kube-api-access-lcxxs\") pod \"catalog-operator-68c6474976-5bhmb\" (UID: \"2a98cf7d-1ef8-421e-85be-1bc5310adfdc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.128981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-mountpoint-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129013 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-policies\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129040 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t2q9\" (UniqueName: \"kubernetes.io/projected/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-kube-api-access-4t2q9\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129059 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-metrics-certs\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2-cert\") pod \"ingress-canary-8fqr9\" (UID: \"e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2\") " pod="openshift-ingress-canary/ingress-canary-8fqr9" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22202c1d-bb44-4648-9e5b-d8a29a94003f-signing-cabundle\") pod \"service-ca-9c57cc56f-h5twj\" (UID: \"22202c1d-bb44-4648-9e5b-d8a29a94003f\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129137 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/31485bfd-8f43-4046-a188-5b69955b5566-profile-collector-cert\") pod \"olm-operator-6b444d44fb-422gm\" (UID: \"31485bfd-8f43-4046-a188-5b69955b5566\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129168 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-webhook-cert\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129209 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebdfe0bc-aee2-4b9e-bea3-e226541a08ea-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z5zmm\" (UID: \"ebdfe0bc-aee2-4b9e-bea3-e226541a08ea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129226 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6898b032-cd5c-42cf-8ebf-df3fec127c5e-console-serving-cert\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129248 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129324 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdf4\" (UniqueName: \"kubernetes.io/projected/9db8e859-4683-4df7-87e2-b84a8113cb35-kube-api-access-9zdf4\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mblch\" (UniqueName: \"kubernetes.io/projected/bf859e5d-c395-4892-9abf-e1a9a9d21176-kube-api-access-mblch\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpfrs\" (UID: \"bf859e5d-c395-4892-9abf-e1a9a9d21176\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129370 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129389 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cae530-6db3-4e7e-bf5f-b4e25ce9d992-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nvrnd\" (UID: \"84cae530-6db3-4e7e-bf5f-b4e25ce9d992\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9b22e0a2-1fea-4571-ad2b-e639bed4832c-certs\") pod \"machine-config-server-l7mvw\" (UID: \"9b22e0a2-1fea-4571-ad2b-e639bed4832c\") " pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129427 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-bound-sa-token\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129465 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6070617-4dba-40fb-a6a8-03f6807a6846-serving-cert\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129489 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9db8e859-4683-4df7-87e2-b84a8113cb35-serving-cert\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xlgs\" (UniqueName: \"kubernetes.io/projected/7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3-kube-api-access-9xlgs\") pod \"downloads-7954f5f757-mfhl4\" (UID: \"7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3\") " pod="openshift-console/downloads-7954f5f757-mfhl4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129524 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qtsxk\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4d79d2f7-de31-4ae6-b630-762dd6a5bef3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-45mp5\" (UID: \"4d79d2f7-de31-4ae6-b630-762dd6a5bef3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/722a2a42-6f33-4002-ab72-3544df821d3b-encryption-config\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77902092-381a-451e-b462-de97532929fe-serving-cert\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-stats-auth\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129631 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwgsv\" (UniqueName: \"kubernetes.io/projected/eb3e9b2a-3512-4b19-90d9-24bdbeafbd54-kube-api-access-lwgsv\") pod \"kube-storage-version-migrator-operator-b67b599dd-fm8qg\" (UID: \"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129650 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/77902092-381a-451e-b462-de97532929fe-audit-policies\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129668 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-942gl\" (UniqueName: \"kubernetes.io/projected/9b92581b-78c7-41ba-adac-0e8336d81508-kube-api-access-942gl\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: \"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129695 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw9x6\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-kube-api-access-lw9x6\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/722a2a42-6f33-4002-ab72-3544df821d3b-node-pullsecrets\") pod \"apiserver-76f77b778f-jkt7j\" (UID: 
\"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129731 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/722a2a42-6f33-4002-ab72-3544df821d3b-etcd-client\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129762 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28604003-081b-4192-af14-bab8aab39d15-etcd-service-ca\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129781 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fbcec9-73cc-4d46-88d5-3596af15db5b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vftjt\" (UID: \"a9fbcec9-73cc-4d46-88d5-3596af15db5b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-config\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129837 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4de37de4-2c3b-4742-96fe-e63227dfc04b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129858 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vqbj\" (UniqueName: \"kubernetes.io/projected/b59e4a45-c4df-4754-8e3e-83e750aee402-kube-api-access-5vqbj\") pod \"marketplace-operator-79b997595-qtsxk\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129875 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbjfd\" (UniqueName: \"kubernetes.io/projected/ebdfe0bc-aee2-4b9e-bea3-e226541a08ea-kube-api-access-tbjfd\") pod \"cluster-samples-operator-665b6dd947-z5zmm\" (UID: \"ebdfe0bc-aee2-4b9e-bea3-e226541a08ea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129878 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/160f98c9-f6b0-4d80-ab97-37487b03a785-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc 
kubenswrapper[4722]: I1002 09:34:47.129911 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m8nh\" (UniqueName: \"kubernetes.io/projected/e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2-kube-api-access-8m8nh\") pod \"ingress-canary-8fqr9\" (UID: \"e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2\") " pod="openshift-ingress-canary/ingress-canary-8fqr9" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.129982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28604003-081b-4192-af14-bab8aab39d15-etcd-client\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130001 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130020 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxswl\" (UniqueName: \"kubernetes.io/projected/bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8-kube-api-access-hxswl\") pod \"multus-admission-controller-857f4d67dd-7ghd2\" (UID: \"bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-default-certificate\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130055 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-etcd-serving-ca\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130073 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130117 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb13bcef-7d21-460c-851d-96eeb995f612-config-volume\") pod \"dns-default-hwbdb\" (UID: \"cb13bcef-7d21-460c-851d-96eeb995f612\") " pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130135 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h996\" (UniqueName: \"kubernetes.io/projected/2537cd59-b6d6-46da-a99f-3f6306a19fdb-kube-api-access-7h996\") pod \"package-server-manager-789f6589d5-kbnrw\" (UID: \"2537cd59-b6d6-46da-a99f-3f6306a19fdb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130153 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-apiservice-cert\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130170 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28604003-081b-4192-af14-bab8aab39d15-etcd-ca\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130209 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmcrw\" (UniqueName: \"kubernetes.io/projected/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-kube-api-access-tmcrw\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28604003-081b-4192-af14-bab8aab39d15-config\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b92581b-78c7-41ba-adac-0e8336d81508-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: 
\"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130277 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpvqb\" (UniqueName: \"kubernetes.io/projected/22202c1d-bb44-4648-9e5b-d8a29a94003f-kube-api-access-xpvqb\") pod \"service-ca-9c57cc56f-h5twj\" (UID: \"22202c1d-bb44-4648-9e5b-d8a29a94003f\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-client-ca\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4de37de4-2c3b-4742-96fe-e63227dfc04b-metrics-tls\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqnv\" (UniqueName: \"kubernetes.io/projected/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-kube-api-access-qjqnv\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130452 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2a98cf7d-1ef8-421e-85be-1bc5310adfdc-srv-cert\") pod \"catalog-operator-68c6474976-5bhmb\" (UID: \"2a98cf7d-1ef8-421e-85be-1bc5310adfdc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.130666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/722a2a42-6f33-4002-ab72-3544df821d3b-node-pullsecrets\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.131035 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-images\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.131471 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:47.631438563 +0000 UTC m=+143.668191683 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.132186 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-config\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136364 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-images\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136506 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-oauth-serving-cert\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136543 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn88h\" (UniqueName: \"kubernetes.io/projected/722a2a42-6f33-4002-ab72-3544df821d3b-kube-api-access-hn88h\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136594 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4de37de4-2c3b-4742-96fe-e63227dfc04b-trusted-ca\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136625 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/31485bfd-8f43-4046-a188-5b69955b5566-srv-cert\") pod \"olm-operator-6b444d44fb-422gm\" (UID: \"31485bfd-8f43-4046-a188-5b69955b5566\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/160f98c9-f6b0-4d80-ab97-37487b03a785-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136695 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/722a2a42-6f33-4002-ab72-3544df821d3b-audit-dir\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcfm7\" (UniqueName: \"kubernetes.io/projected/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-kube-api-access-wcfm7\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136753 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-audit\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbjf4\" (UniqueName: \"kubernetes.io/projected/6898b032-cd5c-42cf-8ebf-df3fec127c5e-kube-api-access-sbjf4\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136842 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136890 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9b22e0a2-1fea-4571-ad2b-e639bed4832c-node-bootstrap-token\") pod \"machine-config-server-l7mvw\" (UID: \"9b22e0a2-1fea-4571-ad2b-e639bed4832c\") " pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136917 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-proxy-tls\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") 
" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136956 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n65hq\" (UniqueName: \"kubernetes.io/projected/92d2f480-8dcd-483c-ba02-41a2027b12ae-kube-api-access-n65hq\") pod \"service-ca-operator-777779d784-c5szj\" (UID: \"92d2f480-8dcd-483c-ba02-41a2027b12ae\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.136994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77902092-381a-451e-b462-de97532929fe-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.137028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.137053 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d2f480-8dcd-483c-ba02-41a2027b12ae-config\") pod \"service-ca-operator-777779d784-c5szj\" (UID: \"92d2f480-8dcd-483c-ba02-41a2027b12ae\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.137090 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzsbj\" (UniqueName: \"kubernetes.io/projected/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-kube-api-access-mzsbj\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.137128 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9db8e859-4683-4df7-87e2-b84a8113cb35-config\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.137161 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84cae530-6db3-4e7e-bf5f-b4e25ce9d992-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nvrnd\" (UID: \"84cae530-6db3-4e7e-bf5f-b4e25ce9d992\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.137302 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-service-ca\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 
09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.137913 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-trusted-ca\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.138810 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-service-ca-bundle\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.139009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-tls\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.139875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-config\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.132222 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db8e859-4683-4df7-87e2-b84a8113cb35-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.140165 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.140458 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4de37de4-2c3b-4742-96fe-e63227dfc04b-metrics-tls\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.140995 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77902092-381a-451e-b462-de97532929fe-audit-dir\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.142867 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b92581b-78c7-41ba-adac-0e8336d81508-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: 
\"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.142885 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6898b032-cd5c-42cf-8ebf-df3fec127c5e-console-oauth-config\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.144060 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-metrics-certs\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.155934 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb3e9b2a-3512-4b19-90d9-24bdbeafbd54-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fm8qg\" (UID: \"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.155991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3e9b2a-3512-4b19-90d9-24bdbeafbd54-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fm8qg\" (UID: \"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.156027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p96n2\" (UniqueName: \"kubernetes.io/projected/cb13bcef-7d21-460c-851d-96eeb995f612-kube-api-access-p96n2\") pod \"dns-default-hwbdb\" (UID: \"cb13bcef-7d21-460c-851d-96eeb995f612\") " pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.156061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9365ecd-353b-4623-a174-45d558e9d16d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4656\" (UID: \"f9365ecd-353b-4623-a174-45d558e9d16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.156095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkwl4\" (UniqueName: \"kubernetes.io/projected/f9365ecd-353b-4623-a174-45d558e9d16d-kube-api-access-xkwl4\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4656\" (UID: \"f9365ecd-353b-4623-a174-45d558e9d16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.156141 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77902092-381a-451e-b462-de97532929fe-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: 
\"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.156176 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.156205 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fbcec9-73cc-4d46-88d5-3596af15db5b-config\") pod \"kube-controller-manager-operator-78b949d7b-vftjt\" (UID: \"a9fbcec9-73cc-4d46-88d5-3596af15db5b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.156235 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b-proxy-tls\") pod \"machine-config-controller-84d6567774-42lsk\" (UID: \"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.156266 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2gv4\" (UniqueName: \"kubernetes.io/projected/e6070617-4dba-40fb-a6a8-03f6807a6846-kube-api-access-g2gv4\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.156297 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77902092-381a-451e-b462-de97532929fe-encryption-config\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.156337 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-config\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.157203 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-config\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.157406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9db8e859-4683-4df7-87e2-b84a8113cb35-config\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.158333 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.158748 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.158873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-audit\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.159977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-image-import-ca\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.160083 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-certificates\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.162614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-policies\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.163586 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.163780 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-console-config\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.163973 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/77902092-381a-451e-b462-de97532929fe-audit-policies\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.164024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.164557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db8e859-4683-4df7-87e2-b84a8113cb35-service-ca-bundle\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.164603 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6898b032-cd5c-42cf-8ebf-df3fec127c5e-trusted-ca-bundle\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.164790 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77902092-381a-451e-b462-de97532929fe-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.166295 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.166432 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4de37de4-2c3b-4742-96fe-e63227dfc04b-trusted-ca\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.167281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722a2a42-6f33-4002-ab72-3544df821d3b-serving-cert\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.167617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/722a2a42-6f33-4002-ab72-3544df821d3b-etcd-serving-ca\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.168204 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-stats-auth\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.168404 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-dir\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.168555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.171038 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.173531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.174640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.175557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.175697 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/722a2a42-6f33-4002-ab72-3544df821d3b-audit-dir\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.176033 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b92581b-78c7-41ba-adac-0e8336d81508-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: \"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.183648 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9db8e859-4683-4df7-87e2-b84a8113cb35-serving-cert\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.184422 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77902092-381a-451e-b462-de97532929fe-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.193203 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/722a2a42-6f33-4002-ab72-3544df821d3b-encryption-config\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.193546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.193912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77902092-381a-451e-b462-de97532929fe-serving-cert\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.194167 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77902092-381a-451e-b462-de97532929fe-encryption-config\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.194381 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.195097 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6898b032-cd5c-42cf-8ebf-df3fec127c5e-console-serving-cert\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.195724 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6070617-4dba-40fb-a6a8-03f6807a6846-serving-cert\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.195799 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-default-certificate\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc 
kubenswrapper[4722]: I1002 09:34:47.196297 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.196980 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/160f98c9-f6b0-4d80-ab97-37487b03a785-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.197226 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.197385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/722a2a42-6f33-4002-ab72-3544df821d3b-etcd-client\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.197438 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77902092-381a-451e-b462-de97532929fe-etcd-client\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.197470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.197693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebdfe0bc-aee2-4b9e-bea3-e226541a08ea-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z5zmm\" (UID: \"ebdfe0bc-aee2-4b9e-bea3-e226541a08ea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.199067 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.208356 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.225765 4722 request.go:700] Waited for 1.895474436s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.227189 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.254885 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-s727n"] Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258057 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258272 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqbj\" (UniqueName: \"kubernetes.io/projected/b59e4a45-c4df-4754-8e3e-83e750aee402-kube-api-access-5vqbj\") pod \"marketplace-operator-79b997595-qtsxk\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.258424 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:47.758370987 +0000 UTC m=+143.795124097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m8nh\" (UniqueName: \"kubernetes.io/projected/e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2-kube-api-access-8m8nh\") pod \"ingress-canary-8fqr9\" (UID: \"e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2\") " pod="openshift-ingress-canary/ingress-canary-8fqr9" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxswl\" (UniqueName: \"kubernetes.io/projected/bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8-kube-api-access-hxswl\") pod \"multus-admission-controller-857f4d67dd-7ghd2\" (UID: \"bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258622 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28604003-081b-4192-af14-bab8aab39d15-etcd-client\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258662 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb13bcef-7d21-460c-851d-96eeb995f612-config-volume\") pod \"dns-default-hwbdb\" (UID: \"cb13bcef-7d21-460c-851d-96eeb995f612\") " pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258698 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h996\" (UniqueName: \"kubernetes.io/projected/2537cd59-b6d6-46da-a99f-3f6306a19fdb-kube-api-access-7h996\") pod \"package-server-manager-789f6589d5-kbnrw\" (UID: \"2537cd59-b6d6-46da-a99f-3f6306a19fdb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-apiservice-cert\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258755 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28604003-081b-4192-af14-bab8aab39d15-etcd-ca\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258807 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28604003-081b-4192-af14-bab8aab39d15-config\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258844 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpvqb\" (UniqueName: \"kubernetes.io/projected/22202c1d-bb44-4648-9e5b-d8a29a94003f-kube-api-access-xpvqb\") pod \"service-ca-9c57cc56f-h5twj\" (UID: \"22202c1d-bb44-4648-9e5b-d8a29a94003f\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258883 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-images\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2a98cf7d-1ef8-421e-85be-1bc5310adfdc-srv-cert\") pod \"catalog-operator-68c6474976-5bhmb\" (UID: \"2a98cf7d-1ef8-421e-85be-1bc5310adfdc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.258957 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/31485bfd-8f43-4046-a188-5b69955b5566-srv-cert\") pod \"olm-operator-6b444d44fb-422gm\" (UID: \"31485bfd-8f43-4046-a188-5b69955b5566\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:47 
crc kubenswrapper[4722]: I1002 09:34:47.258990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcfm7\" (UniqueName: \"kubernetes.io/projected/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-kube-api-access-wcfm7\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-proxy-tls\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259063 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n65hq\" (UniqueName: \"kubernetes.io/projected/92d2f480-8dcd-483c-ba02-41a2027b12ae-kube-api-access-n65hq\") pod \"service-ca-operator-777779d784-c5szj\" (UID: \"92d2f480-8dcd-483c-ba02-41a2027b12ae\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259090 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9b22e0a2-1fea-4571-ad2b-e639bed4832c-node-bootstrap-token\") pod \"machine-config-server-l7mvw\" (UID: \"9b22e0a2-1fea-4571-ad2b-e639bed4832c\") " pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259123 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d2f480-8dcd-483c-ba02-41a2027b12ae-config\") pod \"service-ca-operator-777779d784-c5szj\" (UID: \"92d2f480-8dcd-483c-ba02-41a2027b12ae\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259153 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzsbj\" (UniqueName: \"kubernetes.io/projected/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-kube-api-access-mzsbj\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259181 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p96n2\" (UniqueName: \"kubernetes.io/projected/cb13bcef-7d21-460c-851d-96eeb995f612-kube-api-access-p96n2\") pod \"dns-default-hwbdb\" (UID: \"cb13bcef-7d21-460c-851d-96eeb995f612\") " pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259218 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84cae530-6db3-4e7e-bf5f-b4e25ce9d992-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nvrnd\" (UID: \"84cae530-6db3-4e7e-bf5f-b4e25ce9d992\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259246 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/eb3e9b2a-3512-4b19-90d9-24bdbeafbd54-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fm8qg\" (UID: \"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259269 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3e9b2a-3512-4b19-90d9-24bdbeafbd54-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fm8qg\" (UID: \"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9365ecd-353b-4623-a174-45d558e9d16d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4656\" (UID: \"f9365ecd-353b-4623-a174-45d558e9d16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkwl4\" (UniqueName: \"kubernetes.io/projected/f9365ecd-353b-4623-a174-45d558e9d16d-kube-api-access-xkwl4\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4656\" (UID: \"f9365ecd-353b-4623-a174-45d558e9d16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fbcec9-73cc-4d46-88d5-3596af15db5b-config\") pod \"kube-controller-manager-operator-78b949d7b-vftjt\" (UID: \"a9fbcec9-73cc-4d46-88d5-3596af15db5b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259405 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b-proxy-tls\") pod \"machine-config-controller-84d6567774-42lsk\" (UID: \"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259437 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb13bcef-7d21-460c-851d-96eeb995f612-metrics-tls\") pod \"dns-default-hwbdb\" (UID: \"cb13bcef-7d21-460c-851d-96eeb995f612\") " pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259463 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22202c1d-bb44-4648-9e5b-d8a29a94003f-signing-key\") pod \"service-ca-9c57cc56f-h5twj\" (UID: \"22202c1d-bb44-4648-9e5b-d8a29a94003f\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259501 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7ghd2\" (UID: \"bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk8bv\" (UniqueName: \"kubernetes.io/projected/d30e5a09-5492-426b-89a9-2331cce3cc87-kube-api-access-mk8bv\") pod \"collect-profiles-29323290-fh7nk\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-socket-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8kvn\" (UniqueName: \"kubernetes.io/projected/9b22e0a2-1fea-4571-ad2b-e639bed4832c-kube-api-access-g8kvn\") pod \"machine-config-server-l7mvw\" (UID: \"9b22e0a2-1fea-4571-ad2b-e639bed4832c\") " pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-csi-data-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259639 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2a98cf7d-1ef8-421e-85be-1bc5310adfdc-profile-collector-cert\") pod \"catalog-operator-68c6474976-5bhmb\" (UID: \"2a98cf7d-1ef8-421e-85be-1bc5310adfdc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bnd6\" (UniqueName: \"kubernetes.io/projected/ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b-kube-api-access-2bnd6\") pod \"machine-config-controller-84d6567774-42lsk\" (UID: \"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28604003-081b-4192-af14-bab8aab39d15-serving-cert\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259721 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxhrx\" (UniqueName: \"kubernetes.io/projected/2186d01d-5c96-4c50-9414-bce7cd2d7052-kube-api-access-zxhrx\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " 
pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259751 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2537cd59-b6d6-46da-a99f-3f6306a19fdb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kbnrw\" (UID: \"2537cd59-b6d6-46da-a99f-3f6306a19fdb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-plugins-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259810 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-tmpfs\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259841 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk2t2\" (UniqueName: \"kubernetes.io/projected/31485bfd-8f43-4046-a188-5b69955b5566-kube-api-access-qk2t2\") pod \"olm-operator-6b444d44fb-422gm\" (UID: \"31485bfd-8f43-4046-a188-5b69955b5566\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259865 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9fbcec9-73cc-4d46-88d5-3596af15db5b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vftjt\" (UID: \"a9fbcec9-73cc-4d46-88d5-3596af15db5b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf859e5d-c395-4892-9abf-e1a9a9d21176-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpfrs\" (UID: \"bf859e5d-c395-4892-9abf-e1a9a9d21176\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259926 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qtsxk\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259951 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92d2f480-8dcd-483c-ba02-41a2027b12ae-serving-cert\") pod \"service-ca-operator-777779d784-c5szj\" (UID: \"92d2f480-8dcd-483c-ba02-41a2027b12ae\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.259975 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fcdab41-6e2e-46fe-b9d0-1411949a51e9-metrics-tls\") pod \"dns-operator-744455d44c-xpzp4\" (UID: \"5fcdab41-6e2e-46fe-b9d0-1411949a51e9\") " pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260037 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d30e5a09-5492-426b-89a9-2331cce3cc87-secret-volume\") pod \"collect-profiles-29323290-fh7nk\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260067 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v92ss\" (UniqueName: \"kubernetes.io/projected/28604003-081b-4192-af14-bab8aab39d15-kube-api-access-v92ss\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260092 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-42lsk\" (UID: \"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260128 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260175 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wns8\" (UniqueName: \"kubernetes.io/projected/69b26642-0e96-47a5-9154-9866670cb9e4-kube-api-access-7wns8\") pod \"migrator-59844c95c7-87568\" (UID: \"69b26642-0e96-47a5-9154-9866670cb9e4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-87568" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260223 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-registration-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260249 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30e5a09-5492-426b-89a9-2331cce3cc87-config-volume\") pod \"collect-profiles-29323290-fh7nk\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260274 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d79d2f7-de31-4ae6-b630-762dd6a5bef3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-45mp5\" (UID: \"4d79d2f7-de31-4ae6-b630-762dd6a5bef3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d79d2f7-de31-4ae6-b630-762dd6a5bef3-config\") pod \"kube-apiserver-operator-766d6c64bb-45mp5\" (UID: \"4d79d2f7-de31-4ae6-b630-762dd6a5bef3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260345 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6kxc\" (UniqueName: \"kubernetes.io/projected/5fcdab41-6e2e-46fe-b9d0-1411949a51e9-kube-api-access-k6kxc\") pod \"dns-operator-744455d44c-xpzp4\" (UID: \"5fcdab41-6e2e-46fe-b9d0-1411949a51e9\") " pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84cae530-6db3-4e7e-bf5f-b4e25ce9d992-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nvrnd\" (UID: \"84cae530-6db3-4e7e-bf5f-b4e25ce9d992\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260397 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9365ecd-353b-4623-a174-45d558e9d16d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4656\" (UID: \"f9365ecd-353b-4623-a174-45d558e9d16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcxxs\" (UniqueName: \"kubernetes.io/projected/2a98cf7d-1ef8-421e-85be-1bc5310adfdc-kube-api-access-lcxxs\") pod \"catalog-operator-68c6474976-5bhmb\" (UID: \"2a98cf7d-1ef8-421e-85be-1bc5310adfdc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260447 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-mountpoint-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260487 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2-cert\") pod \"ingress-canary-8fqr9\" (UID: \"e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2\") " pod="openshift-ingress-canary/ingress-canary-8fqr9" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260511 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/31485bfd-8f43-4046-a188-5b69955b5566-profile-collector-cert\") pod \"olm-operator-6b444d44fb-422gm\" (UID: \"31485bfd-8f43-4046-a188-5b69955b5566\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260537 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22202c1d-bb44-4648-9e5b-d8a29a94003f-signing-cabundle\") pod \"service-ca-9c57cc56f-h5twj\" (UID: \"22202c1d-bb44-4648-9e5b-d8a29a94003f\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260567 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-webhook-cert\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260650 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mblch\" (UniqueName: \"kubernetes.io/projected/bf859e5d-c395-4892-9abf-e1a9a9d21176-kube-api-access-mblch\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpfrs\" (UID: \"bf859e5d-c395-4892-9abf-e1a9a9d21176\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260685 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cae530-6db3-4e7e-bf5f-b4e25ce9d992-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nvrnd\" (UID: \"84cae530-6db3-4e7e-bf5f-b4e25ce9d992\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260710 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9b22e0a2-1fea-4571-ad2b-e639bed4832c-certs\") pod \"machine-config-server-l7mvw\" (UID: \"9b22e0a2-1fea-4571-ad2b-e639bed4832c\") " pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260758 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qtsxk\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d79d2f7-de31-4ae6-b630-762dd6a5bef3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-45mp5\" (UID: 
\"4d79d2f7-de31-4ae6-b630-762dd6a5bef3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwgsv\" (UniqueName: \"kubernetes.io/projected/eb3e9b2a-3512-4b19-90d9-24bdbeafbd54-kube-api-access-lwgsv\") pod \"kube-storage-version-migrator-operator-b67b599dd-fm8qg\" (UID: \"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260881 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28604003-081b-4192-af14-bab8aab39d15-etcd-service-ca\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.260903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fbcec9-73cc-4d46-88d5-3596af15db5b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vftjt\" (UID: \"a9fbcec9-73cc-4d46-88d5-3596af15db5b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.262060 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-apiservice-cert\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.262460 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-plugins-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.262664 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpzbq\" (UniqueName: \"kubernetes.io/projected/77902092-381a-451e-b462-de97532929fe-kube-api-access-fpzbq\") pod \"apiserver-7bbb656c7d-fpttx\" (UID: \"77902092-381a-451e-b462-de97532929fe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.263065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-tmpfs\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.264550 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-42lsk\" (UID: \"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" Oct 02 09:34:47 crc kubenswrapper[4722]: 
I1002 09:34:47.264672 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fbcec9-73cc-4d46-88d5-3596af15db5b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vftjt\" (UID: \"a9fbcec9-73cc-4d46-88d5-3596af15db5b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.264714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-socket-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.265092 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9365ecd-353b-4623-a174-45d558e9d16d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4656\" (UID: \"f9365ecd-353b-4623-a174-45d558e9d16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.265614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fbcec9-73cc-4d46-88d5-3596af15db5b-config\") pod \"kube-controller-manager-operator-78b949d7b-vftjt\" (UID: \"a9fbcec9-73cc-4d46-88d5-3596af15db5b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.265901 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.266032 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-registration-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.266675 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30e5a09-5492-426b-89a9-2331cce3cc87-config-volume\") pod \"collect-profiles-29323290-fh7nk\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.266957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28604003-081b-4192-af14-bab8aab39d15-config\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.267588 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-images\") pod 
\"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.267735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf859e5d-c395-4892-9abf-e1a9a9d21176-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpfrs\" (UID: \"bf859e5d-c395-4892-9abf-e1a9a9d21176\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.267794 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9365ecd-353b-4623-a174-45d558e9d16d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4656\" (UID: \"f9365ecd-353b-4623-a174-45d558e9d16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.267975 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-mountpoint-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.268413 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb13bcef-7d21-460c-851d-96eeb995f612-config-volume\") pod \"dns-default-hwbdb\" (UID: \"cb13bcef-7d21-460c-851d-96eeb995f612\") " pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.268729 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d2f480-8dcd-483c-ba02-41a2027b12ae-config\") pod \"service-ca-operator-777779d784-c5szj\" (UID: \"92d2f480-8dcd-483c-ba02-41a2027b12ae\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.269798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cae530-6db3-4e7e-bf5f-b4e25ce9d992-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nvrnd\" (UID: \"84cae530-6db3-4e7e-bf5f-b4e25ce9d992\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.270388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28604003-081b-4192-af14-bab8aab39d15-etcd-client\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.270587 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2186d01d-5c96-4c50-9414-bce7cd2d7052-csi-data-dir\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.271104 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qtsxk\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.271999 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/31485bfd-8f43-4046-a188-5b69955b5566-srv-cert\") pod \"olm-operator-6b444d44fb-422gm\" (UID: \"31485bfd-8f43-4046-a188-5b69955b5566\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.273996 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b-proxy-tls\") pod \"machine-config-controller-84d6567774-42lsk\" (UID: \"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.274544 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/31485bfd-8f43-4046-a188-5b69955b5566-profile-collector-cert\") pod \"olm-operator-6b444d44fb-422gm\" (UID: \"31485bfd-8f43-4046-a188-5b69955b5566\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.275670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d79d2f7-de31-4ae6-b630-762dd6a5bef3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-45mp5\" (UID: \"4d79d2f7-de31-4ae6-b630-762dd6a5bef3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.276143 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22202c1d-bb44-4648-9e5b-d8a29a94003f-signing-cabundle\") pod \"service-ca-9c57cc56f-h5twj\" (UID: \"22202c1d-bb44-4648-9e5b-d8a29a94003f\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.276154 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d79d2f7-de31-4ae6-b630-762dd6a5bef3-config\") pod \"kube-apiserver-operator-766d6c64bb-45mp5\" (UID: \"4d79d2f7-de31-4ae6-b630-762dd6a5bef3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.276790 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28604003-081b-4192-af14-bab8aab39d15-etcd-ca\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.277762 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"] Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.278150 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:47.77813201 +0000 UTC m=+143.814884990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.280707 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-webhook-cert\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.280981 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9b22e0a2-1fea-4571-ad2b-e639bed4832c-certs\") pod \"machine-config-server-l7mvw\" (UID: \"9b22e0a2-1fea-4571-ad2b-e639bed4832c\") " pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.281017 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28604003-081b-4192-af14-bab8aab39d15-serving-cert\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.282508 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qtsxk\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.282634 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7ghd2\" (UID: \"bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.282736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2-cert\") pod \"ingress-canary-8fqr9\" (UID: \"e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2\") " pod="openshift-ingress-canary/ingress-canary-8fqr9" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.282977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2a98cf7d-1ef8-421e-85be-1bc5310adfdc-srv-cert\") pod \"catalog-operator-68c6474976-5bhmb\" (UID: \"2a98cf7d-1ef8-421e-85be-1bc5310adfdc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.283022 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-proxy-tls\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.283343 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22202c1d-bb44-4648-9e5b-d8a29a94003f-signing-key\") pod \"service-ca-9c57cc56f-h5twj\" (UID: \"22202c1d-bb44-4648-9e5b-d8a29a94003f\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.283409 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2a98cf7d-1ef8-421e-85be-1bc5310adfdc-profile-collector-cert\") pod \"catalog-operator-68c6474976-5bhmb\" (UID: \"2a98cf7d-1ef8-421e-85be-1bc5310adfdc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.283593 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3e9b2a-3512-4b19-90d9-24bdbeafbd54-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fm8qg\" (UID: \"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.284388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92d2f480-8dcd-483c-ba02-41a2027b12ae-serving-cert\") pod \"service-ca-operator-777779d784-c5szj\" (UID: \"92d2f480-8dcd-483c-ba02-41a2027b12ae\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.284623 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28604003-081b-4192-af14-bab8aab39d15-etcd-service-ca\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.290231 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84cae530-6db3-4e7e-bf5f-b4e25ce9d992-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nvrnd\" (UID: \"84cae530-6db3-4e7e-bf5f-b4e25ce9d992\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.292959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fcdab41-6e2e-46fe-b9d0-1411949a51e9-metrics-tls\") pod \"dns-operator-744455d44c-xpzp4\" (UID: \"5fcdab41-6e2e-46fe-b9d0-1411949a51e9\") " pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.293755 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2537cd59-b6d6-46da-a99f-3f6306a19fdb-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-kbnrw\" (UID: \"2537cd59-b6d6-46da-a99f-3f6306a19fdb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.294587 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb13bcef-7d21-460c-851d-96eeb995f612-metrics-tls\") pod \"dns-default-hwbdb\" (UID: \"cb13bcef-7d21-460c-851d-96eeb995f612\") " pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.295190 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb3e9b2a-3512-4b19-90d9-24bdbeafbd54-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fm8qg\" (UID: \"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.295551 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9b22e0a2-1fea-4571-ad2b-e639bed4832c-node-bootstrap-token\") pod \"machine-config-server-l7mvw\" (UID: \"9b22e0a2-1fea-4571-ad2b-e639bed4832c\") " pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.302267 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjqnv\" (UniqueName: \"kubernetes.io/projected/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-kube-api-access-qjqnv\") pod \"oauth-openshift-558db77b4-ljxl7\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.303008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d30e5a09-5492-426b-89a9-2331cce3cc87-secret-volume\") pod \"collect-profiles-29323290-fh7nk\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.308510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-942gl\" (UniqueName: \"kubernetes.io/projected/9b92581b-78c7-41ba-adac-0e8336d81508-kube-api-access-942gl\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: \"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: W1002 09:34:47.311811 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab29eb0_bca1_422c_a796_6c220bb76022.slice/crio-b75c0af343ac0965ca56b680b3401c8b52cd148d221f05b0b8be9caf53badfc3 WatchSource:0}: Error finding container b75c0af343ac0965ca56b680b3401c8b52cd148d221f05b0b8be9caf53badfc3: Status 404 returned error can't find the container with id b75c0af343ac0965ca56b680b3401c8b52cd148d221f05b0b8be9caf53badfc3 Oct 02 09:34:47 crc kubenswrapper[4722]: W1002 09:34:47.313073 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4104d0d5_52a2_47da_9cd0_86d4c52bc122.slice/crio-6a36b54e551113d0dcd7682bb6aa084a99cbe0acd2bb7e9dda37f7a5046b225f WatchSource:0}: Error 
finding container 6a36b54e551113d0dcd7682bb6aa084a99cbe0acd2bb7e9dda37f7a5046b225f: Status 404 returned error can't find the container with id 6a36b54e551113d0dcd7682bb6aa084a99cbe0acd2bb7e9dda37f7a5046b225f Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.324283 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94f8n\" (UniqueName: \"kubernetes.io/projected/4de37de4-2c3b-4742-96fe-e63227dfc04b-kube-api-access-94f8n\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.334106 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.354907 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4de37de4-2c3b-4742-96fe-e63227dfc04b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-954gs\" (UID: \"4de37de4-2c3b-4742-96fe-e63227dfc04b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.362377 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.362560 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:47.86252914 +0000 UTC m=+143.899282130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.363210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.364685 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:47.864671296 +0000 UTC m=+143.901424276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.372981 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbjfd\" (UniqueName: \"kubernetes.io/projected/ebdfe0bc-aee2-4b9e-bea3-e226541a08ea-kube-api-access-tbjfd\") pod \"cluster-samples-operator-665b6dd947-z5zmm\" (UID: \"ebdfe0bc-aee2-4b9e-bea3-e226541a08ea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.380578 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5"] Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.383192 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdf4\" (UniqueName: \"kubernetes.io/projected/9db8e859-4683-4df7-87e2-b84a8113cb35-kube-api-access-9zdf4\") pod \"authentication-operator-69f744f599-dgwxp\" (UID: \"9db8e859-4683-4df7-87e2-b84a8113cb35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.403240 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t"] Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.404763 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2q9\" (UniqueName: \"kubernetes.io/projected/86bc5bcd-672b-4bb6-8090-1ed2d77fff50-kube-api-access-4t2q9\") pod \"machine-api-operator-5694c8668f-krj4j\" (UID: \"86bc5bcd-672b-4bb6-8090-1ed2d77fff50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.420257 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-bound-sa-token\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.433920 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" event={"ID":"75b01f76-263d-4523-87e1-eb33d01164a7","Type":"ContainerStarted","Data":"4f3d36edf8e192fe2541f9bdaafaa20f82e9d580e01559e43cd48bac5cdfa72b"} Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.435803 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" event={"ID":"4104d0d5-52a2-47da-9cd0-86d4c52bc122","Type":"ContainerStarted","Data":"6a36b54e551113d0dcd7682bb6aa084a99cbe0acd2bb7e9dda37f7a5046b225f"} Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.437800 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-s727n" 
event={"ID":"5ab29eb0-bca1-422c-a796-6c220bb76022","Type":"ContainerStarted","Data":"b75c0af343ac0965ca56b680b3401c8b52cd148d221f05b0b8be9caf53badfc3"} Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.450931 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbjf4\" (UniqueName: \"kubernetes.io/projected/6898b032-cd5c-42cf-8ebf-df3fec127c5e-kube-api-access-sbjf4\") pod \"console-f9d7485db-clnnl\" (UID: \"6898b032-cd5c-42cf-8ebf-df3fec127c5e\") " pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.451223 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.462179 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b92581b-78c7-41ba-adac-0e8336d81508-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cr6pt\" (UID: \"9b92581b-78c7-41ba-adac-0e8336d81508\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.469242 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.469942 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:47.969921856 +0000 UTC m=+144.006674836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.492932 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmcrw\" (UniqueName: \"kubernetes.io/projected/fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a-kube-api-access-tmcrw\") pod \"router-default-5444994796-6xfhq\" (UID: \"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a\") " pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.502833 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.504649 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn88h\" (UniqueName: \"kubernetes.io/projected/722a2a42-6f33-4002-ab72-3544df821d3b-kube-api-access-hn88h\") pod \"apiserver-76f77b778f-jkt7j\" (UID: \"722a2a42-6f33-4002-ab72-3544df821d3b\") " pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.521202 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.525332 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xlgs\" (UniqueName: \"kubernetes.io/projected/7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3-kube-api-access-9xlgs\") pod \"downloads-7954f5f757-mfhl4\" (UID: \"7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3\") " pod="openshift-console/downloads-7954f5f757-mfhl4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.547667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw9x6\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-kube-api-access-lw9x6\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.571391 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.572101 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.072076356 +0000 UTC m=+144.108829356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.573599 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ljxl7"] Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.575506 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2gv4\" (UniqueName: \"kubernetes.io/projected/e6070617-4dba-40fb-a6a8-03f6807a6846-kube-api-access-g2gv4\") pod \"controller-manager-879f6c89f-zk6t7\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.602662 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vqbj\" (UniqueName: \"kubernetes.io/projected/b59e4a45-c4df-4754-8e3e-83e750aee402-kube-api-access-5vqbj\") pod \"marketplace-operator-79b997595-qtsxk\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.616038 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.623366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxswl\" (UniqueName: \"kubernetes.io/projected/bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8-kube-api-access-hxswl\") pod \"multus-admission-controller-857f4d67dd-7ghd2\" (UID: \"bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.643588 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.646262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m8nh\" (UniqueName: \"kubernetes.io/projected/e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2-kube-api-access-8m8nh\") pod \"ingress-canary-8fqr9\" (UID: \"e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2\") " pod="openshift-ingress-canary/ingress-canary-8fqr9" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.651697 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mfhl4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.662300 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6kxc\" (UniqueName: \"kubernetes.io/projected/5fcdab41-6e2e-46fe-b9d0-1411949a51e9-kube-api-access-k6kxc\") pod \"dns-operator-744455d44c-xpzp4\" (UID: \"5fcdab41-6e2e-46fe-b9d0-1411949a51e9\") " pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.662729 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8fqr9" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.666256 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.672650 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.672849 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.172818201 +0000 UTC m=+144.209571181 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.673155 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.673678 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.173664993 +0000 UTC m=+144.210418143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.682284 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.685557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v92ss\" (UniqueName: \"kubernetes.io/projected/28604003-081b-4192-af14-bab8aab39d15-kube-api-access-v92ss\") pod \"etcd-operator-b45778765-xkx2c\" (UID: \"28604003-081b-4192-af14-bab8aab39d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.702698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk2t2\" (UniqueName: \"kubernetes.io/projected/31485bfd-8f43-4046-a188-5b69955b5566-kube-api-access-qk2t2\") pod \"olm-operator-6b444d44fb-422gm\" (UID: \"31485bfd-8f43-4046-a188-5b69955b5566\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.718412 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.722050 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9fbcec9-73cc-4d46-88d5-3596af15db5b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vftjt\" (UID: \"a9fbcec9-73cc-4d46-88d5-3596af15db5b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.746616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84cae530-6db3-4e7e-bf5f-b4e25ce9d992-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nvrnd\" (UID: \"84cae530-6db3-4e7e-bf5f-b4e25ce9d992\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.760025 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.761830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkwl4\" (UniqueName: \"kubernetes.io/projected/f9365ecd-353b-4623-a174-45d558e9d16d-kube-api-access-xkwl4\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4656\" (UID: \"f9365ecd-353b-4623-a174-45d558e9d16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.765948 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.774038 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.774646 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.274624192 +0000 UTC m=+144.311377172 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.775570 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-954gs"] Oct 02 09:34:47 crc kubenswrapper[4722]: W1002 09:34:47.782289 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de37de4_2c3b_4742_96fe_e63227dfc04b.slice/crio-dcf85342f1523812f1b70b0b48025970f9e7d5aa88c7511eea70d26f61c17ee5 WatchSource:0}: Error finding container dcf85342f1523812f1b70b0b48025970f9e7d5aa88c7511eea70d26f61c17ee5: Status 404 returned error can't find the container with id dcf85342f1523812f1b70b0b48025970f9e7d5aa88c7511eea70d26f61c17ee5 Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.786063 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.789823 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wns8\" (UniqueName: \"kubernetes.io/projected/69b26642-0e96-47a5-9154-9866670cb9e4-kube-api-access-7wns8\") pod \"migrator-59844c95c7-87568\" (UID: \"69b26642-0e96-47a5-9154-9866670cb9e4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-87568" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.792430 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.797277 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx"] Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.804959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpvqb\" (UniqueName: \"kubernetes.io/projected/22202c1d-bb44-4648-9e5b-d8a29a94003f-kube-api-access-xpvqb\") pod \"service-ca-9c57cc56f-h5twj\" (UID: \"22202c1d-bb44-4648-9e5b-d8a29a94003f\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.821673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcxxs\" (UniqueName: \"kubernetes.io/projected/2a98cf7d-1ef8-421e-85be-1bc5310adfdc-kube-api-access-lcxxs\") pod \"catalog-operator-68c6474976-5bhmb\" (UID: \"2a98cf7d-1ef8-421e-85be-1bc5310adfdc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.834290 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.842730 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-87568" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.843007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h996\" (UniqueName: \"kubernetes.io/projected/2537cd59-b6d6-46da-a99f-3f6306a19fdb-kube-api-access-7h996\") pod \"package-server-manager-789f6589d5-kbnrw\" (UID: \"2537cd59-b6d6-46da-a99f-3f6306a19fdb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.851175 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.862867 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.865763 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bnd6\" (UniqueName: \"kubernetes.io/projected/ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b-kube-api-access-2bnd6\") pod \"machine-config-controller-84d6567774-42lsk\" (UID: \"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.876593 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.877177 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.878017 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.377997475 +0000 UTC m=+144.414750455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.886654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8kvn\" (UniqueName: \"kubernetes.io/projected/9b22e0a2-1fea-4571-ad2b-e639bed4832c-kube-api-access-g8kvn\") pod \"machine-config-server-l7mvw\" (UID: \"9b22e0a2-1fea-4571-ad2b-e639bed4832c\") " pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.889557 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.902162 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcfm7\" (UniqueName: \"kubernetes.io/projected/2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1-kube-api-access-wcfm7\") pod \"packageserver-d55dfcdfc-b2tld\" (UID: \"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.906896 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.917866 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.925675 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.962460 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-l7mvw" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.963358 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk8bv\" (UniqueName: \"kubernetes.io/projected/d30e5a09-5492-426b-89a9-2331cce3cc87-kube-api-access-mk8bv\") pod \"collect-profiles-29323290-fh7nk\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.967958 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm"] Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.973121 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzsbj\" (UniqueName: \"kubernetes.io/projected/5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d-kube-api-access-mzsbj\") pod \"machine-config-operator-74547568cd-nhcxl\" (UID: \"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.978950 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:47 crc kubenswrapper[4722]: E1002 09:34:47.979734 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.479707994 +0000 UTC m=+144.516460974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:47 crc kubenswrapper[4722]: I1002 09:34:47.993138 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d79d2f7-de31-4ae6-b630-762dd6a5bef3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-45mp5\" (UID: \"4d79d2f7-de31-4ae6-b630-762dd6a5bef3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:47.999509 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p96n2\" (UniqueName: \"kubernetes.io/projected/cb13bcef-7d21-460c-851d-96eeb995f612-kube-api-access-p96n2\") pod \"dns-default-hwbdb\" (UID: \"cb13bcef-7d21-460c-851d-96eeb995f612\") " pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.002528 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxhrx\" (UniqueName: \"kubernetes.io/projected/2186d01d-5c96-4c50-9414-bce7cd2d7052-kube-api-access-zxhrx\") pod \"csi-hostpathplugin-h645w\" (UID: \"2186d01d-5c96-4c50-9414-bce7cd2d7052\") " pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.030722 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n65hq\" (UniqueName: \"kubernetes.io/projected/92d2f480-8dcd-483c-ba02-41a2027b12ae-kube-api-access-n65hq\") pod \"service-ca-operator-777779d784-c5szj\" (UID: \"92d2f480-8dcd-483c-ba02-41a2027b12ae\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.042915 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.059054 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mblch\" (UniqueName: \"kubernetes.io/projected/bf859e5d-c395-4892-9abf-e1a9a9d21176-kube-api-access-mblch\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpfrs\" (UID: \"bf859e5d-c395-4892-9abf-e1a9a9d21176\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.080767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwgsv\" (UniqueName: \"kubernetes.io/projected/eb3e9b2a-3512-4b19-90d9-24bdbeafbd54-kube-api-access-lwgsv\") pod \"kube-storage-version-migrator-operator-b67b599dd-fm8qg\" (UID: \"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.082235 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.082835 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.58281243 +0000 UTC m=+144.619565410 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.083507 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.097018 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mfhl4"] Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.116796 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.127626 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.167340 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.174598 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.183254 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.183833 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.68381284 +0000 UTC m=+144.720565820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.183862 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt"] Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.184589 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.196566 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-clnnl"] Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.197937 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.243979 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-h645w" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.269049 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:48 crc kubenswrapper[4722]: W1002 09:34:48.276300 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc04bfb1_a85e_43f5_8f5f_1a4c8fd0fe7a.slice/crio-70349e4c420d51bc974ab3b1e330300e972bd62ab716c728f338d3ccfdb724c1 WatchSource:0}: Error finding container 70349e4c420d51bc974ab3b1e330300e972bd62ab716c728f338d3ccfdb724c1: Status 404 returned error can't find the container with id 70349e4c420d51bc974ab3b1e330300e972bd62ab716c728f338d3ccfdb724c1 Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.284744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.285616 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.78556131 +0000 UTC m=+144.822314500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.293751 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dgwxp"] Oct 02 09:34:48 crc kubenswrapper[4722]: W1002 09:34:48.336954 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9fbcec9_73cc_4d46_88d5_3596af15db5b.slice/crio-6054ff3f5b218cc516c7a4bed19dab049e7ccc61e0efbdd1b7ec5043eff66837 WatchSource:0}: Error finding container 6054ff3f5b218cc516c7a4bed19dab049e7ccc61e0efbdd1b7ec5043eff66837: Status 404 returned error can't find the container with id 6054ff3f5b218cc516c7a4bed19dab049e7ccc61e0efbdd1b7ec5043eff66837 Oct 02 09:34:48 crc kubenswrapper[4722]: W1002 09:34:48.363584 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9db8e859_4683_4df7_87e2_b84a8113cb35.slice/crio-557097a30c83462f9bb6a130e522baa313030a0adc986521155526fe286a55fe WatchSource:0}: Error finding container 557097a30c83462f9bb6a130e522baa313030a0adc986521155526fe286a55fe: Status 404 returned error can't find the container with id 557097a30c83462f9bb6a130e522baa313030a0adc986521155526fe286a55fe Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.373760 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.386584 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.386834 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.886765215 +0000 UTC m=+144.923518195 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.386949 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.387698 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:48.887682299 +0000 UTC m=+144.924435279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.448909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" event={"ID":"db4cb93f-b00a-42b1-b5d2-704e6c6010c6","Type":"ContainerStarted","Data":"5e3092471504a86095de8d2e8b496d377696a22976e94ff192c7b421077a1a3d"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.451505 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-krj4j"] Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.459959 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" event={"ID":"4de37de4-2c3b-4742-96fe-e63227dfc04b","Type":"ContainerStarted","Data":"dcf85342f1523812f1b70b0b48025970f9e7d5aa88c7511eea70d26f61c17ee5"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.472014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mfhl4" event={"ID":"7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3","Type":"ContainerStarted","Data":"fa746738dbfed892e7aea9a87c0c162a4381fecbbcb556e94e28d1f618521ebb"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.472910 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6xfhq" event={"ID":"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a","Type":"ContainerStarted","Data":"70349e4c420d51bc974ab3b1e330300e972bd62ab716c728f338d3ccfdb724c1"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.474551 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt"] Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.475423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" event={"ID":"77902092-381a-451e-b462-de97532929fe","Type":"ContainerStarted","Data":"937d84e895c4fd778865a384fb458e317c9cc8689d3fcb39497315da36cc5ae5"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.481491 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-clnnl" event={"ID":"6898b032-cd5c-42cf-8ebf-df3fec127c5e","Type":"ContainerStarted","Data":"d42847f1c0461faaac13693a9bd30dab9969537e9555814b56302996fa9eb254"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.489709 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.490172 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 09:34:48.990145868 +0000 UTC m=+145.026898848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.568818 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xpzp4"] Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.579773 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" event={"ID":"a9fbcec9-73cc-4d46-88d5-3596af15db5b","Type":"ContainerStarted","Data":"6054ff3f5b218cc516c7a4bed19dab049e7ccc61e0efbdd1b7ec5043eff66837"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.592268 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.593004 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:49.092987117 +0000 UTC m=+145.129740097 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.593150 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-s727n" event={"ID":"5ab29eb0-bca1-422c-a796-6c220bb76022","Type":"ContainerStarted","Data":"07c59c69d325ecd293584beded5e24acda9138c8a8a1a3e90a20239ba5195feb"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.593598 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.597526 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jkt7j"] Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.617474 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-s727n container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.617554 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-s727n" podUID="5ab29eb0-bca1-422c-a796-6c220bb76022" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.628264 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" event={"ID":"9db8e859-4683-4df7-87e2-b84a8113cb35","Type":"ContainerStarted","Data":"557097a30c83462f9bb6a130e522baa313030a0adc986521155526fe286a55fe"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.655955 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t" event={"ID":"75d348d9-43ce-41be-afdf-1a35627522c8","Type":"ContainerStarted","Data":"8a1b71133a5d2531f0461cb48f5c2c2a0ad9566674fb3d56f006fc9a161159c0"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.655996 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t" event={"ID":"75d348d9-43ce-41be-afdf-1a35627522c8","Type":"ContainerStarted","Data":"542017cf87fcbc452ed3ad6617569871ee7205e01e0aeacff280302591b8ac7b"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.669972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" event={"ID":"75b01f76-263d-4523-87e1-eb33d01164a7","Type":"ContainerStarted","Data":"907e88811811acb449a63610c91ceee0ad46b9d74ae9bc72baee04201216f027"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.671600 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" 
event={"ID":"4104d0d5-52a2-47da-9cd0-86d4c52bc122","Type":"ContainerStarted","Data":"f0d0a6890e32e6337bf39f4e43cfa0b6f045e101f7a3f7698da8337461d2c197"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.672451 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.673687 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-pf6dm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.673733 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" podUID="4104d0d5-52a2-47da-9cd0-86d4c52bc122" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.675204 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" event={"ID":"e5de8035-7427-4646-ad95-13c58ecad616","Type":"ContainerStarted","Data":"fc2529d8c748eec5ed68fa1607ef2536a261ce16fe16f82c306db976f6a11a7a"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.675257 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" event={"ID":"e5de8035-7427-4646-ad95-13c58ecad616","Type":"ContainerStarted","Data":"22899dc6508278cefd9f81dbc21be6dde0fa401f22109b294977e2d769c26714"} Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.694019 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.694288 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:49.194269875 +0000 UTC m=+145.231022855 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.694429 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.696652 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:49.196643106 +0000 UTC m=+145.233396076 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.705821 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8fqr9"] Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.720611 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-s727n" podStartSLOduration=121.720581788 podStartE2EDuration="2m1.720581788s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:48.718856913 +0000 UTC m=+144.755609893" watchObservedRunningTime="2025-10-02 09:34:48.720581788 +0000 UTC m=+144.757334768" Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.795645 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.796511 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:49.296420515 +0000 UTC m=+145.333173495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.898261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:48 crc kubenswrapper[4722]: E1002 09:34:48.899925 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:49.399904751 +0000 UTC m=+145.436657731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:48 crc kubenswrapper[4722]: I1002 09:34:48.965685 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.000385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.000941 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:49.500911902 +0000 UTC m=+145.537664892 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.104293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.105781 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:49.605761382 +0000 UTC m=+145.642514542 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.133355 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtsxk"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.171046 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-87568"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.173843 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.175696 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xkx2c"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.215732 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.216235 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:49.716208368 +0000 UTC m=+145.752961348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.317024 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.317458 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:49.817445605 +0000 UTC m=+145.854198585 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.319874 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw"] Oct 02 09:34:49 crc kubenswrapper[4722]: W1002 09:34:49.373143 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6051c5a_7df2_4c89_a0c8_03d2bd3c2cc2.slice/crio-97f392ac0477554749422b8ec112fdc25e352a7446e153c2342bef85d427f665 WatchSource:0}: Error finding container 97f392ac0477554749422b8ec112fdc25e352a7446e153c2342bef85d427f665: Status 404 returned error can't find the container with id 97f392ac0477554749422b8ec112fdc25e352a7446e153c2342bef85d427f665 Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.420232 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.421476 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:49.921456754 +0000 UTC m=+145.958209734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.428435 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656"] Oct 02 09:34:49 crc kubenswrapper[4722]: W1002 09:34:49.430124 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84cae530_6db3_4e7e_bf5f_b4e25ce9d992.slice/crio-c9090526be6124e5d59cc7737a8305a7ff85025224f4897f9398872fb4f9045d WatchSource:0}: Error finding container c9090526be6124e5d59cc7737a8305a7ff85025224f4897f9398872fb4f9045d: Status 404 returned error can't find the container with id c9090526be6124e5d59cc7737a8305a7ff85025224f4897f9398872fb4f9045d Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.432419 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.433331 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h5twj"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.446071 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb"] Oct 02 09:34:49 crc kubenswrapper[4722]: W1002 09:34:49.457779 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31485bfd_8f43_4046_a188_5b69955b5566.slice/crio-30291610ee31fa1a84a101caec45f1aa9b59c3f905a536e699e7b9f178195715 WatchSource:0}: Error finding container 30291610ee31fa1a84a101caec45f1aa9b59c3f905a536e699e7b9f178195715: Status 404 returned error can't find the container with id 30291610ee31fa1a84a101caec45f1aa9b59c3f905a536e699e7b9f178195715 Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.459065 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk6t7"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.461643 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7ghd2"] Oct 02 09:34:49 crc kubenswrapper[4722]: W1002 09:34:49.495784 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9365ecd_353b_4623_a174_45d558e9d16d.slice/crio-3151ec6761ec72b6c4bece4f347299f4ca9946414be67e00274d40b2fab2820d WatchSource:0}: Error finding container 3151ec6761ec72b6c4bece4f347299f4ca9946414be67e00274d40b2fab2820d: Status 404 returned error can't find the container with id 3151ec6761ec72b6c4bece4f347299f4ca9946414be67e00274d40b2fab2820d Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.525366 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.528039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.528443 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.02842719 +0000 UTC m=+146.065180170 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.571041 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.572652 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.580872 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg"] Oct 02 09:34:49 crc kubenswrapper[4722]: W1002 09:34:49.594862 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf4132d7_ccbc_4e9a_adb2_2b92ee9165d8.slice/crio-b05697abd65680f9abe6ab754896fd79e88f4cd5875dc6305448632fa319c7a7 WatchSource:0}: Error finding container b05697abd65680f9abe6ab754896fd79e88f4cd5875dc6305448632fa319c7a7: Status 404 returned error can't find the container with id b05697abd65680f9abe6ab754896fd79e88f4cd5875dc6305448632fa319c7a7 Oct 02 09:34:49 crc kubenswrapper[4722]: W1002 09:34:49.625867 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6070617_4dba_40fb_a6a8_03f6807a6846.slice/crio-e1041cd64d5292462736a179ea4c83f1274dba1122330f23dba78842236b2f70 WatchSource:0}: Error finding container e1041cd64d5292462736a179ea4c83f1274dba1122330f23dba78842236b2f70: Status 404 returned error can't find the container with id e1041cd64d5292462736a179ea4c83f1274dba1122330f23dba78842236b2f70 Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.629199 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.629799 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 09:34:50.12977668 +0000 UTC m=+146.166529660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.670558 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.731066 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b9q7t" podStartSLOduration=122.731045918 podStartE2EDuration="2m2.731045918s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:49.718886952 +0000 UTC m=+145.755639922" watchObservedRunningTime="2025-10-02 09:34:49.731045918 +0000 UTC m=+145.767798888" Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.731123 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h645w"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.735464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.735949 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.235930505 +0000 UTC m=+146.272683485 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.747668 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" event={"ID":"722a2a42-6f33-4002-ab72-3544df821d3b","Type":"ContainerStarted","Data":"6d71cd689c7bcd6905fd4e9e887d2b3871204818b2779a833475659c96f3a5b5"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.750045 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8fqr9" event={"ID":"e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2","Type":"ContainerStarted","Data":"97f392ac0477554749422b8ec112fdc25e352a7446e153c2342bef85d427f665"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.760054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" event={"ID":"5fcdab41-6e2e-46fe-b9d0-1411949a51e9","Type":"ContainerStarted","Data":"81d0bff73725fee485c4a873d002062369f288ae6812f346c59d4d723020a088"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.762250 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-l7mvw" event={"ID":"9b22e0a2-1fea-4571-ad2b-e639bed4832c","Type":"ContainerStarted","Data":"85d30583b9bc15e4b12aeb0e84c91048159eb6249383ed23e6e87c1cb446f7c9"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.768484 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" podStartSLOduration=121.768455859 podStartE2EDuration="2m1.768455859s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:49.760190854 +0000 UTC m=+145.796943834" watchObservedRunningTime="2025-10-02 09:34:49.768455859 +0000 UTC m=+145.805208869" Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.768599 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" event={"ID":"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54","Type":"ContainerStarted","Data":"f091bcb7877006584074eb354efd3d6017b9bbd8ed894ebfcf0af311244227f8"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.773786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-87568" event={"ID":"69b26642-0e96-47a5-9154-9866670cb9e4","Type":"ContainerStarted","Data":"5f4af015fca9a8f4ea992efbc36b2fa927636c3bb3a165d1ccdefb76c5b4b2d1"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.775248 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" event={"ID":"86bc5bcd-672b-4bb6-8090-1ed2d77fff50","Type":"ContainerStarted","Data":"3fe1dfb0373aec611152b5e95b54836595c1024d47b19d808a32072277c20126"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.783678 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" event={"ID":"d30e5a09-5492-426b-89a9-2331cce3cc87","Type":"ContainerStarted","Data":"af9e2cbe63a20dcb7059986ca2ed21ddccd901f1a391ac997340b63e1edfd8a0"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.786029 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.788411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" event={"ID":"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1","Type":"ContainerStarted","Data":"1c5da181440d9b8959773a16c4f4def02b0d6dea117dcddd7e36de036dc3a752"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.792009 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" event={"ID":"4d79d2f7-de31-4ae6-b630-762dd6a5bef3","Type":"ContainerStarted","Data":"8cf976ca7d9d5bf2783a86ed8da50a7521901509e2e4cee87d86b65b4a511eed"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.797325 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" event={"ID":"e6070617-4dba-40fb-a6a8-03f6807a6846","Type":"ContainerStarted","Data":"e1041cd64d5292462736a179ea4c83f1274dba1122330f23dba78842236b2f70"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.812522 4722 generic.go:334] "Generic (PLEG): container finished" podID="e5de8035-7427-4646-ad95-13c58ecad616" containerID="fc2529d8c748eec5ed68fa1607ef2536a261ce16fe16f82c306db976f6a11a7a" exitCode=0 Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.812639 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" event={"ID":"e5de8035-7427-4646-ad95-13c58ecad616","Type":"ContainerDied","Data":"fc2529d8c748eec5ed68fa1607ef2536a261ce16fe16f82c306db976f6a11a7a"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.818754 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" event={"ID":"28604003-081b-4192-af14-bab8aab39d15","Type":"ContainerStarted","Data":"aea9bff7e65d17fa61f2469bb0a4f7585b4bec0d6fcba4bcf6809c9e871b5db3"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.819134 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c5szj"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.820854 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" event={"ID":"22202c1d-bb44-4648-9e5b-d8a29a94003f","Type":"ContainerStarted","Data":"ba3059ae87d2cacc33a9d3e0c9ed2744f2b4b867d52d76c1391d0ab7a857a530"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.835145 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" event={"ID":"ebdfe0bc-aee2-4b9e-bea3-e226541a08ea","Type":"ContainerStarted","Data":"40cf34461c97ebcdeea1a10589e44058d861bef9d0d5b0193be62be4732684e3"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.836812 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.837299 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.337249944 +0000 UTC m=+146.374002924 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.837409 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.838032 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.338009183 +0000 UTC m=+146.374762333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.845770 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" event={"ID":"2537cd59-b6d6-46da-a99f-3f6306a19fdb","Type":"ContainerStarted","Data":"509164273b03b87840f1b543bca9c51dba0bf58474c3e18920c99da2e4152261"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.851150 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" event={"ID":"b59e4a45-c4df-4754-8e3e-83e750aee402","Type":"ContainerStarted","Data":"76371d9dd430f1ddf40e267b1670ad84e39a2c4c2e75024f29f91f3a7db2c58c"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.853822 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" event={"ID":"4de37de4-2c3b-4742-96fe-e63227dfc04b","Type":"ContainerStarted","Data":"5f001e835aff993832e019154342133825af1edbdbb656f3f70b4804ce27232c"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.854568 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" event={"ID":"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d","Type":"ContainerStarted","Data":"69c6d712107d563bed16a4dc01b70f937a46627888fb70dde417f318b7b3946d"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.855235 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" event={"ID":"bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8","Type":"ContainerStarted","Data":"b05697abd65680f9abe6ab754896fd79e88f4cd5875dc6305448632fa319c7a7"} Oct 02 09:34:49 crc kubenswrapper[4722]: W1002 09:34:49.857535 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2186d01d_5c96_4c50_9414_bce7cd2d7052.slice/crio-3bebd178b2a0632bed32be5c305e9e92da9d18d1dce95932194d901ccb80f365 WatchSource:0}: Error finding container 3bebd178b2a0632bed32be5c305e9e92da9d18d1dce95932194d901ccb80f365: Status 404 returned error can't find the container with id 3bebd178b2a0632bed32be5c305e9e92da9d18d1dce95932194d901ccb80f365 Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.859408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" event={"ID":"2a98cf7d-1ef8-421e-85be-1bc5310adfdc","Type":"ContainerStarted","Data":"485e7679ee35dc7be37b59dcbc770a7c737cfeaa6df4c88130fc9e0d8d10a5b1"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.863908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" event={"ID":"9b92581b-78c7-41ba-adac-0e8336d81508","Type":"ContainerStarted","Data":"44a39b285f06ca2e1ca7df91b89130abe255f7440f75069ae5911ecebd6303f3"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.865230 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" event={"ID":"31485bfd-8f43-4046-a188-5b69955b5566","Type":"ContainerStarted","Data":"30291610ee31fa1a84a101caec45f1aa9b59c3f905a536e699e7b9f178195715"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.866009 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hwbdb"] Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.868469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" event={"ID":"f9365ecd-353b-4623-a174-45d558e9d16d","Type":"ContainerStarted","Data":"3151ec6761ec72b6c4bece4f347299f4ca9946414be67e00274d40b2fab2820d"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.869701 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" event={"ID":"84cae530-6db3-4e7e-bf5f-b4e25ce9d992","Type":"ContainerStarted","Data":"c9090526be6124e5d59cc7737a8305a7ff85025224f4897f9398872fb4f9045d"} Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.870756 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-pf6dm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.870802 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" podUID="4104d0d5-52a2-47da-9cd0-86d4c52bc122" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.870830 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-s727n container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.870877 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-s727n" podUID="5ab29eb0-bca1-422c-a796-6c220bb76022" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.938817 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.939038 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.438997224 +0000 UTC m=+146.475750204 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: I1002 09:34:49.940284 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:49 crc kubenswrapper[4722]: E1002 09:34:49.940828 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.440805781 +0000 UTC m=+146.477558991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:49 crc kubenswrapper[4722]: W1002 09:34:49.942167 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb13bcef_7d21_460c_851d_96eeb995f612.slice/crio-eef4e2bdacbe410f4b9e7715b4a1072e8772b25eda33b80133c2ee5a3b8bacce WatchSource:0}: Error finding container eef4e2bdacbe410f4b9e7715b4a1072e8772b25eda33b80133c2ee5a3b8bacce: Status 404 returned error can't find the container with id eef4e2bdacbe410f4b9e7715b4a1072e8772b25eda33b80133c2ee5a3b8bacce Oct 02 09:34:49 crc kubenswrapper[4722]: W1002 09:34:49.943421 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d2f480_8dcd_483c_ba02_41a2027b12ae.slice/crio-241c9df0d2035d5e1c600015706cb81f989666de5650cb92b5868334a9f9dc55 WatchSource:0}: Error finding container 241c9df0d2035d5e1c600015706cb81f989666de5650cb92b5868334a9f9dc55: Status 404 returned error can't find the container with id 241c9df0d2035d5e1c600015706cb81f989666de5650cb92b5868334a9f9dc55 Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.041091 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.041545 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 09:34:50.541525704 +0000 UTC m=+146.578278684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.142890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.143530 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.643509561 +0000 UTC m=+146.680262541 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.244089 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.244403 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.744351357 +0000 UTC m=+146.781104337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.245202 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.245685 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.745673662 +0000 UTC m=+146.782426642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.346967 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.347567 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.847541845 +0000 UTC m=+146.884294835 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.347819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.348243 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.848233463 +0000 UTC m=+146.884986443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.449482 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.450138 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:50.950117587 +0000 UTC m=+146.986870567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.551549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.552031 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.052006541 +0000 UTC m=+147.088759701 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.652558 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.652890 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.152857988 +0000 UTC m=+147.189610968 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.653880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.654680 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.154662484 +0000 UTC m=+147.191415464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.755405 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.755618 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.255579893 +0000 UTC m=+147.292332873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.756287 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.756662 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.256646771 +0000 UTC m=+147.293399751 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.857108 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.857296 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.357267592 +0000 UTC m=+147.394020572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.857560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.857993 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.35797442 +0000 UTC m=+147.394727420 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.877380 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6xfhq" event={"ID":"fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a","Type":"ContainerStarted","Data":"8adba6ce6258fdb064ad3caf935d7889b573ca781849707b97627371bc7ee7d0"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.879010 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-clnnl" event={"ID":"6898b032-cd5c-42cf-8ebf-df3fec127c5e","Type":"ContainerStarted","Data":"5181275d2bf41157d6d3e50189920adf944ccd79646654a38320a49c8add4914"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.880345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs" event={"ID":"bf859e5d-c395-4892-9abf-e1a9a9d21176","Type":"ContainerStarted","Data":"e54254c8aba1e50ccb40e46a055dec8a2baad4796cb1e6942d5300077033d3fe"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.881722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" event={"ID":"22202c1d-bb44-4648-9e5b-d8a29a94003f","Type":"ContainerStarted","Data":"d61e65c3a6def33190d2b46efc361eb1a85767daa519c64e330c0e5567599534"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.883616 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" event={"ID":"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b","Type":"ContainerStarted","Data":"fddc66e5242da291d4381d687f6349df1e9ce458ae8dc7f6721906825f1f8ff5"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.884872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" event={"ID":"5fcdab41-6e2e-46fe-b9d0-1411949a51e9","Type":"ContainerStarted","Data":"fede6d1132ad26e29b5609aced11c731bed60ab448a76fd8f31cbd6ebd55244b"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.886131 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" event={"ID":"86bc5bcd-672b-4bb6-8090-1ed2d77fff50","Type":"ContainerStarted","Data":"5c4078086abdc1e4c3cee1e7a47100ab1e38f8838f0c0304f5a1905eda9a10f8"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.886994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwbdb" event={"ID":"cb13bcef-7d21-460c-851d-96eeb995f612","Type":"ContainerStarted","Data":"eef4e2bdacbe410f4b9e7715b4a1072e8772b25eda33b80133c2ee5a3b8bacce"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.888071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" event={"ID":"9db8e859-4683-4df7-87e2-b84a8113cb35","Type":"ContainerStarted","Data":"48d772814d3006da26d6ffca8eba6e83184184148048c39f967f47a291c55413"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.890577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-l7mvw" event={"ID":"9b22e0a2-1fea-4571-ad2b-e639bed4832c","Type":"ContainerStarted","Data":"bc990eb7728bf1a1c1b3738ed3ab76f8cfe4ffca9b09fb7286991fea4867fc41"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.891731 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" event={"ID":"ebdfe0bc-aee2-4b9e-bea3-e226541a08ea","Type":"ContainerStarted","Data":"5c7010a8ddb018e51d7e4449604027ca7acda32cece7319e5cc80b36aad533b6"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.893138 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" event={"ID":"2a98cf7d-1ef8-421e-85be-1bc5310adfdc","Type":"ContainerStarted","Data":"835efe8d9e156e60142cf8ca41718f891f3cc2d8744574a3b9facdef9e40e06c"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.894621 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" event={"ID":"31485bfd-8f43-4046-a188-5b69955b5566","Type":"ContainerStarted","Data":"c1005d253d9efd4a9e6a3974e648d0a8b3611990fb97044d0ce89ab44b3028fc"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.898276 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6xfhq" podStartSLOduration=122.898251535 podStartE2EDuration="2m2.898251535s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:50.898224185 +0000 UTC m=+146.934977185" watchObservedRunningTime="2025-10-02 09:34:50.898251535 +0000 UTC m=+146.935004515" Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.908899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" event={"ID":"722a2a42-6f33-4002-ab72-3544df821d3b","Type":"ContainerStarted","Data":"8e89e3a99e487ca2c19ca66e6f1bf67c295780e9914a9bcccbe44e46b09645c9"} Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 
09:34:50.912109 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h645w" event={"ID":"2186d01d-5c96-4c50-9414-bce7cd2d7052","Type":"ContainerStarted","Data":"3bebd178b2a0632bed32be5c305e9e92da9d18d1dce95932194d901ccb80f365"}
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.916301 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" event={"ID":"92d2f480-8dcd-483c-ba02-41a2027b12ae","Type":"ContainerStarted","Data":"241c9df0d2035d5e1c600015706cb81f989666de5650cb92b5868334a9f9dc55"}
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.930373 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" event={"ID":"75b01f76-263d-4523-87e1-eb33d01164a7","Type":"ContainerStarted","Data":"c304213e0ffa2421f305c8f46c2ca00bc294c9ed671f9ab8ea40f0646cbf058d"}
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.934018 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dgwxp" podStartSLOduration=123.933979102 podStartE2EDuration="2m3.933979102s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:50.930158313 +0000 UTC m=+146.966911293" watchObservedRunningTime="2025-10-02 09:34:50.933979102 +0000 UTC m=+146.970732082"
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.937219 4722 generic.go:334] "Generic (PLEG): container finished" podID="77902092-381a-451e-b462-de97532929fe" containerID="cec5dafa92d98fbda7e451dd6b304ecce4b70f998e6b3713d8060bd1fb16bdc8" exitCode=0
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.937386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" event={"ID":"77902092-381a-451e-b462-de97532929fe","Type":"ContainerDied","Data":"cec5dafa92d98fbda7e451dd6b304ecce4b70f998e6b3713d8060bd1fb16bdc8"}
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.942543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" event={"ID":"db4cb93f-b00a-42b1-b5d2-704e6c6010c6","Type":"ContainerStarted","Data":"079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba"}
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.942755 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7"
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.957037 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mfhl4" event={"ID":"7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3","Type":"ContainerStarted","Data":"51e90351ef7bd44d08f6a8388f22f9009a2f1812735662d94b322a17fefc008d"}
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.958414 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mfhl4"
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.958773 4722 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ljxl7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body=
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.960256 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" podUID="db4cb93f-b00a-42b1-b5d2-704e6c6010c6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused"
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.959769 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.959867 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.459843843 +0000 UTC m=+147.496596823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.962783 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:50 crc kubenswrapper[4722]: E1002 09:34:50.963514 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.463496388 +0000 UTC m=+147.500249368 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
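Both halves of that failure pair stop at the same gate. The kubelet only issues CSI calls for a volume once the driver's node plugin has registered itself on the kubelet's plugin-registration socket, and kubevirt.io.hostpath-provisioner has not done so yet; its csi-hostpathplugin-h645w pod only just reported ContainerStarted in the first record of this window. A minimal Go sketch of that registry lookup, assuming a simple name-to-endpoint map (all identifiers here are illustrative, not the actual kubelet source):

    package main

    import (
    	"fmt"
    	"sync"
    )

    // driverRegistry stands in for the kubelet's in-memory table of CSI
    // plugins that have announced themselves on the plugin-registration
    // socket. Until the hostpath plugin registers, lookups fail.
    type driverRegistry struct {
    	mu      sync.RWMutex
    	drivers map[string]string // driver name -> unix socket endpoint
    }

    func (r *driverRegistry) get(name string) (string, error) {
    	r.mu.RLock()
    	defer r.mu.RUnlock()
    	ep, ok := r.drivers[name]
    	if !ok {
    		// Same failure shape as the records above: every MountDevice and
    		// TearDownAt attempt fails fast here and gets requeued.
    		return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
    	}
    	return ep, nil
    }

    func main() {
    	reg := &driverRegistry{drivers: map[string]string{}}
    	if _, err := reg.get("kubevirt.io.hostpath-provisioner"); err != nil {
    		fmt.Println("MountDevice:", err) // driver pod not registered yet
    	}
    	// Once the csi-hostpathplugin pod registers, the same lookup succeeds
    	// and the queued volume operations drain. Socket path is hypothetical.
    	reg.drivers["kubevirt.io.hostpath-provisioner"] = "/var/lib/kubelet/plugins/csi-hostpath/csi.sock"
    	fmt.Println(reg.get("kubevirt.io.hostpath-provisioner"))
    }
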
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.965108 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" event={"ID":"9b92581b-78c7-41ba-adac-0e8336d81508","Type":"ContainerStarted","Data":"91bfa1a8a6d52b30e427fb6cabbcfb2f922b4af2e6e157db6e5a0651b7b4e1f5"}
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.967081 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" event={"ID":"a9fbcec9-73cc-4d46-88d5-3596af15db5b","Type":"ContainerStarted","Data":"9c4c9d7d3e57bbcce894264fb7a4c8bd83f6e2b88f8e27c00ba2ac9af1cf3e76"}
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.967238 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-clnnl" podStartSLOduration=123.967214745 podStartE2EDuration="2m3.967214745s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:50.950384648 +0000 UTC m=+146.987137638" watchObservedRunningTime="2025-10-02 09:34:50.967214745 +0000 UTC m=+147.003967715"
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.969335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" event={"ID":"84cae530-6db3-4e7e-bf5f-b4e25ce9d992","Type":"ContainerStarted","Data":"2ee7e8021cdb7d48527d72ff4f32d3c5c0ed390bf0e2f2195c24e2768bedc291"}
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.971543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8fqr9" event={"ID":"e6051c5a-7df2-4c89-a0c8-03d2bd3c2cc2","Type":"ContainerStarted","Data":"012cb01998c590692434f00ae96d49402fa44b6dbc3f2edd4ff79fd1f8d32617"}
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.972022 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-pf6dm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.972093 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" podUID="4104d0d5-52a2-47da-9cd0-86d4c52bc122" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.974555 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
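The probe records keep to a two-line pattern: patch_prober.go logs the raw probe output, then prober.go records the verdict. For a readiness probe a failure only flips the container to unready; nothing is restarted. A hedged sketch of what such an HTTP probe does (the endpoint is the downloads pod address from the records; the real prober also handles headers, redirects, and body capture):

    package main

    import (
    	"fmt"
    	"net/http"
    	"time"
    )

    // probeOnce has the rough shape of an HTTP readiness probe: GET the
    // endpoint, report any transport error or non-2xx status as a failure,
    // with the error text as the probe output. This is only a sketch.
    func probeOnce(url string) (ok bool, output string) {
    	client := &http.Client{Timeout: time.Second}
    	resp, err := client.Get(url)
    	if err != nil {
    		// err.Error() has the same form as the outputs above, e.g.
    		// Get "http://10.217.0.21:8080/": dial tcp ...: connection refused
    		return false, err.Error()
    	}
    	defer resp.Body.Close()
    	if resp.StatusCode < 200 || resp.StatusCode >= 300 {
    		return false, fmt.Sprintf("HTTP %d", resp.StatusCode)
    	}
    	return true, ""
    }

    func main() {
    	// While the download-server is not yet accepting connections the
    	// probe fails fast and the kubelet merely marks the pod unready.
    	ok, out := probeOnce("http://10.217.0.21:8080/")
    	fmt.Println(ok, out)
    }
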
Oct 02 09:34:50 crc kubenswrapper[4722]: I1002 09:34:50.974649 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.005915 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-flf5c" podStartSLOduration=124.005896008 podStartE2EDuration="2m4.005896008s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:50.975038888 +0000 UTC m=+147.011791878" watchObservedRunningTime="2025-10-02 09:34:51.005896008 +0000 UTC m=+147.042648998"
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.026334 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" podStartSLOduration=124.026280157 podStartE2EDuration="2m4.026280157s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:51.020011755 +0000 UTC m=+147.056764745" watchObservedRunningTime="2025-10-02 09:34:51.026280157 +0000 UTC m=+147.063033137"
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.046928 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mfhl4" podStartSLOduration=124.046894952 podStartE2EDuration="2m4.046894952s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:51.039173962 +0000 UTC m=+147.075926942" watchObservedRunningTime="2025-10-02 09:34:51.046894952 +0000 UTC m=+147.083647932"
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.071228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.072977 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.572951528 +0000 UTC m=+147.609704508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.073634 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vftjt" podStartSLOduration=123.073619856 podStartE2EDuration="2m3.073619856s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:51.07186575 +0000 UTC m=+147.108618730" watchObservedRunningTime="2025-10-02 09:34:51.073619856 +0000 UTC m=+147.110372836"
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.173294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.173838 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.673827066 +0000 UTC m=+147.710580046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.274638 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.274839 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.774805246 +0000 UTC m=+147.811558236 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.274949 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.275378 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.775369131 +0000 UTC m=+147.812122191 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.375754 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.375884 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.875864979 +0000 UTC m=+147.912617959 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.376115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.376458 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.876448404 +0000 UTC m=+147.913201374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.477094 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.477389 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.977348222 +0000 UTC m=+148.014101202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.477523 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.478059 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:51.97804639 +0000 UTC m=+148.014799410 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.578884 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.579108 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.079066882 +0000 UTC m=+148.115819862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.579406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.580042 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.080031127 +0000 UTC m=+148.116784217 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.680336 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.680579 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.180535835 +0000 UTC m=+148.217288815 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.680674 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.681191 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.181170961 +0000 UTC m=+148.217923941 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.720197 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6xfhq"
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.721962 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.722030 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.782034 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.782215 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.282187072 +0000 UTC m=+148.318940052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.782299 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.782615 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.282602263 +0000 UTC m=+148.319355243 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.883512 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.883653 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.383625884 +0000 UTC m=+148.420378864 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.884465 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.885348 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.385304607 +0000 UTC m=+148.422057767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.984834 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" event={"ID":"b59e4a45-c4df-4754-8e3e-83e750aee402","Type":"ContainerStarted","Data":"89a4fb98e6dccc5f688f2bc02950b914caf07cde7ebe25a62e8ea622d5b2e78f"}
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.985146 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.985172 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk"
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.985533 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.485490787 +0000 UTC m=+148.522243767 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
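Each failed volume operation is parked rather than retried inline: the volume reconciler requeues it on every sync pass (the attempts above land roughly 100ms apart), but nestedpendingoperations refuses to start it again until the per-operation deadline passes, which is why every record shows durationBeforeRetry 500ms and a "No retries permitted until" time exactly half a second out. A small sketch of that gate, assuming a plain map keyed by the volume's unique name (illustrative, not the kubelet's actual implementation):

    package main

    import (
    	"fmt"
    	"time"
    )

    // retryGate sketches the "No retries permitted until ..." bookkeeping:
    // after a failure the operation is parked until now+delay, and the
    // reconciler's frequent sync passes simply skip it until the deadline
    // expires.
    type retryGate struct {
    	notBefore map[string]time.Time
    }

    func (g *retryGate) fail(key string, delay time.Duration) {
    	g.notBefore[key] = time.Now().Add(delay)
    	fmt.Printf("No retries permitted until %s (durationBeforeRetry %s)\n",
    		g.notBefore[key].Format(time.RFC3339Nano), delay)
    }

    func (g *retryGate) mayRun(key string) bool {
    	return time.Now().After(g.notBefore[key])
    }

    func main() {
    	g := &retryGate{notBefore: map[string]time.Time{}}
    	key := "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"
    	g.fail(key, 500*time.Millisecond) // driver still unregistered
    	fmt.Println("may run now?", g.mayRun(key))
    	time.Sleep(600 * time.Millisecond)
    	fmt.Println("may run after the backoff window?", g.mayRun(key))
    }
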
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.985890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:51 crc kubenswrapper[4722]: E1002 09:34:51.986395 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.48638251 +0000 UTC m=+148.523135490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.988107 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qtsxk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.988156 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" podUID="b59e4a45-c4df-4754-8e3e-83e750aee402" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.989756 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-87568" event={"ID":"69b26642-0e96-47a5-9154-9866670cb9e4","Type":"ContainerStarted","Data":"b2fc15d97acb34d711ad72413ab9cb2a5fbd66d37a8fbc1584d89c3897ab1757"}
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.992153 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" event={"ID":"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b","Type":"ContainerStarted","Data":"e74c1f1959074aa6114c52981c3d872a3732b564677c07f902b37328689a0a1a"}
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.994477 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" event={"ID":"2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1","Type":"ContainerStarted","Data":"9915ff4b10d3e606165c121ab0793b61a395aae3b3b2847518fe8a2556658f0c"}
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.995037 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld"
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.996052 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-b2tld container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body=
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.996101 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" podUID="2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused"
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.997638 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" event={"ID":"e5de8035-7427-4646-ad95-13c58ecad616","Type":"ContainerStarted","Data":"792d8b15c7a0a429bd0c5a865b06743574c9e6ad459ad274f453d95adf952843"}
Oct 02 09:34:51 crc kubenswrapper[4722]: I1002 09:34:51.999299 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" event={"ID":"28604003-081b-4192-af14-bab8aab39d15","Type":"ContainerStarted","Data":"68ae9a290df6d768660e1c3108b68ee64c12ad02851d9aa5f42753c9105d3dd0"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.004263 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" event={"ID":"92d2f480-8dcd-483c-ba02-41a2027b12ae","Type":"ContainerStarted","Data":"b2b0aea1960302df5d0bb305ee8d2a0151ee631853140fa4ee0f19de8e8ba890"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.005457 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" event={"ID":"4d79d2f7-de31-4ae6-b630-762dd6a5bef3","Type":"ContainerStarted","Data":"4819bd318a02796b96234b1584d09a80acb1fff32585b7904608e0f62e0678b3"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.008878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" event={"ID":"4de37de4-2c3b-4742-96fe-e63227dfc04b","Type":"ContainerStarted","Data":"607d814c97f1d03406a77ad7385f7e7fbd45807ee034a9ac94b0659c2df8ed60"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.010884 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" event={"ID":"bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8","Type":"ContainerStarted","Data":"84c9c5e3c718d803c5be78b6aebeb31a8aa51247ea462f189d584ef781e809f5"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.012454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" event={"ID":"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d","Type":"ContainerStarted","Data":"a328f034ed0a26d92217f247cebfa5ef21bd6849d02b07f5eab3b4e8c1e16cc6"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.013994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" event={"ID":"e6070617-4dba-40fb-a6a8-03f6807a6846","Type":"ContainerStarted","Data":"b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.015676 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" event={"ID":"d30e5a09-5492-426b-89a9-2331cce3cc87","Type":"ContainerStarted","Data":"832518e0d992589482b66314149f1878d43878e531b6241ec07f251280a5d918"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.017195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" event={"ID":"2537cd59-b6d6-46da-a99f-3f6306a19fdb","Type":"ContainerStarted","Data":"9a36feb2116a5dc9f58948ef5c061c7f5dfdd658bad6a490e5fd0ba58c2e7db9"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.018889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" event={"ID":"eb3e9b2a-3512-4b19-90d9-24bdbeafbd54","Type":"ContainerStarted","Data":"cddd1e7449c3671e958e2679011c906b1bb56d575cdf476d2df6dbb7696bf482"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.020226 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwbdb" event={"ID":"cb13bcef-7d21-460c-851d-96eeb995f612","Type":"ContainerStarted","Data":"0afa78e2bb25723b01c5cb9b55108c7cbcc3a7b647c9954ae83bbb8d231b64f3"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.021630 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs" event={"ID":"bf859e5d-c395-4892-9abf-e1a9a9d21176","Type":"ContainerStarted","Data":"edfde6d8bdca3e8a9af682ec8dcef91ef63740d1d6c7d61a2e38cb53a3c76ff5"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.024383 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" podStartSLOduration=124.024368996 podStartE2EDuration="2m4.024368996s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.003866194 +0000 UTC m=+148.040619174" watchObservedRunningTime="2025-10-02 09:34:52.024368996 +0000 UTC m=+148.061121976"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.025880 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" podStartSLOduration=124.025872675 podStartE2EDuration="2m4.025872675s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.025585157 +0000 UTC m=+148.062338147" watchObservedRunningTime="2025-10-02 09:34:52.025872675 +0000 UTC m=+148.062625665"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.027485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" event={"ID":"77902092-381a-451e-b462-de97532929fe","Type":"ContainerStarted","Data":"3005a617c69347eeefe8fda5fd0e82a4bf81cd8eec9720aee1cddd06e52833f6"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.032247 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" event={"ID":"86bc5bcd-672b-4bb6-8090-1ed2d77fff50","Type":"ContainerStarted","Data":"a9550a7e38aa9c366a6b229e238809e8032d62374d9f9f266cc3e5de359cad97"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.033704 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" event={"ID":"f9365ecd-353b-4623-a174-45d558e9d16d","Type":"ContainerStarted","Data":"c21b652b821b29f9e133516dadcf8df56cceb6b09699bb5f634a3da3ba84db7f"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.035179 4722 generic.go:334] "Generic (PLEG): container finished" podID="722a2a42-6f33-4002-ab72-3544df821d3b" containerID="8e89e3a99e487ca2c19ca66e6f1bf67c295780e9914a9bcccbe44e46b09645c9" exitCode=0
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.036533 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" event={"ID":"722a2a42-6f33-4002-ab72-3544df821d3b","Type":"ContainerDied","Data":"8e89e3a99e487ca2c19ca66e6f1bf67c295780e9914a9bcccbe44e46b09645c9"}
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.036656 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.036759 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.038153 4722 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ljxl7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body=
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.038194 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" podUID="db4cb93f-b00a-42b1-b5d2-704e6c6010c6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.044716 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" podStartSLOduration=125.044690503 podStartE2EDuration="2m5.044690503s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.044343984 +0000 UTC m=+148.081096964" watchObservedRunningTime="2025-10-02 09:34:52.044690503 +0000 UTC m=+148.081443483"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.059848 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-h5twj" podStartSLOduration=124.059822546 podStartE2EDuration="2m4.059822546s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.058931342 +0000 UTC m=+148.095684322" watchObservedRunningTime="2025-10-02 09:34:52.059822546 +0000 UTC m=+148.096575516"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.073104 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4656" podStartSLOduration=125.07308557 podStartE2EDuration="2m5.07308557s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.072493304 +0000 UTC m=+148.109246284" watchObservedRunningTime="2025-10-02 09:34:52.07308557 +0000 UTC m=+148.109838550"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.087151 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.087374 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.58733914 +0000 UTC m=+148.624092120 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.091562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.093496 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.593482019 +0000 UTC m=+148.630234999 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.099418 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nvrnd" podStartSLOduration=124.099381852 podStartE2EDuration="2m4.099381852s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.098076668 +0000 UTC m=+148.134829648" watchObservedRunningTime="2025-10-02 09:34:52.099381852 +0000 UTC m=+148.136134842"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.166547 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" podStartSLOduration=124.166524554 podStartE2EDuration="2m4.166524554s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.166069713 +0000 UTC m=+148.202822703" watchObservedRunningTime="2025-10-02 09:34:52.166524554 +0000 UTC m=+148.203277554"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.193764 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.199105 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.699079899 +0000 UTC m=+148.735832889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.216168 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8fqr9" podStartSLOduration=7.216147392 podStartE2EDuration="7.216147392s" podCreationTimestamp="2025-10-02 09:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.208421062 +0000 UTC m=+148.245174042" watchObservedRunningTime="2025-10-02 09:34:52.216147392 +0000 UTC m=+148.252900372"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.235720 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-l7mvw" podStartSLOduration=7.234290493 podStartE2EDuration="7.234290493s" podCreationTimestamp="2025-10-02 09:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.233561024 +0000 UTC m=+148.270314014" watchObservedRunningTime="2025-10-02 09:34:52.234290493 +0000 UTC m=+148.271043473"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.252708 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr6pt" podStartSLOduration=125.25268855 podStartE2EDuration="2m5.25268855s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.250611956 +0000 UTC m=+148.287364936" watchObservedRunningTime="2025-10-02 09:34:52.25268855 +0000 UTC m=+148.289441530"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.273954 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" podStartSLOduration=124.273933182 podStartE2EDuration="2m4.273933182s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:52.270991435 +0000 UTC m=+148.307744435" watchObservedRunningTime="2025-10-02 09:34:52.273933182 +0000 UTC m=+148.310686162"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.313949 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.314557 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.814545145 +0000 UTC m=+148.851298125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.417720 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.417926 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.917889077 +0000 UTC m=+148.954642057 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.418303 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.418728 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:52.918719699 +0000 UTC m=+148.955472679 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.519957 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.520178 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.02014016 +0000 UTC m=+149.056893140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.521212 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.521751 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.021715071 +0000 UTC m=+149.058468051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.622456 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.622677 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.12264808 +0000 UTC m=+149.159401060 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.622770 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.623338 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.123290337 +0000 UTC m=+149.160043497 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.721566 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.721660 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.725039 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.725250 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.225227582 +0000 UTC m=+149.261980562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.725401 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.725778 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.225757786 +0000 UTC m=+149.262510766 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.841998 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.842915 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.342895385 +0000 UTC m=+149.379648365 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:52 crc kubenswrapper[4722]: I1002 09:34:52.944955 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:52 crc kubenswrapper[4722]: E1002 09:34:52.945605 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.445561689 +0000 UTC m=+149.482314829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.045999 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.046205 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.54616673 +0000 UTC m=+149.582919730 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.046682 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.047102 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.547092124 +0000 UTC m=+149.583845104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.052810 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" event={"ID":"722a2a42-6f33-4002-ab72-3544df821d3b","Type":"ContainerStarted","Data":"c16d69e60d0b65d547720368fc6bd3b0a7913fde291b507ffeb145778fe038ee"} Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.057102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwbdb" event={"ID":"cb13bcef-7d21-460c-851d-96eeb995f612","Type":"ContainerStarted","Data":"f4d38a3cd81bf215608b4ada102e55ccc8780c8830d77e345650d9f4ed2c9328"} Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.061755 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" event={"ID":"2537cd59-b6d6-46da-a99f-3f6306a19fdb","Type":"ContainerStarted","Data":"11f8a6cb370b87af2a19ce4eb93b578dedfb9ae7721273356d1813d945bb8432"} Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.063443 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.071195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" event={"ID":"ebdfe0bc-aee2-4b9e-bea3-e226541a08ea","Type":"ContainerStarted","Data":"694c5322c3a554eafd679658bf7eb490c69be6da99ee75dfddb99d569fe28f6d"} Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.076656 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-87568" event={"ID":"69b26642-0e96-47a5-9154-9866670cb9e4","Type":"ContainerStarted","Data":"2dcf612a036ddbc0fa6d9fa09f8f29c0fb800d3169b124498a794cc70b36b0d3"} Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.083176 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" event={"ID":"5fcdab41-6e2e-46fe-b9d0-1411949a51e9","Type":"ContainerStarted","Data":"71b9da21724f05088be420330bc9d11e0c001ae3e1f9615cf1dc9238cdd6e575"} Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.085088 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.086898 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qtsxk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.086946 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" podUID="b59e4a45-c4df-4754-8e3e-83e750aee402" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.087471 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" podStartSLOduration=125.087449321 podStartE2EDuration="2m5.087449321s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.085190803 +0000 UTC m=+149.121943803" watchObservedRunningTime="2025-10-02 09:34:53.087449321 +0000 UTC m=+149.124202301" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.087719 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.090537 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-b2tld container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.090738 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" podUID="2eb914e0-ee72-482c-a8d0-ba6b8f9f95d1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.092638 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zk6t7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.092712 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" podUID="e6070617-4dba-40fb-a6a8-03f6807a6846" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.109414 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-krj4j" podStartSLOduration=125.109251737 podStartE2EDuration="2m5.109251737s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.105165881 +0000 UTC m=+149.141918871" watchObservedRunningTime="2025-10-02 09:34:53.109251737 +0000 UTC m=+149.146004707" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.147825 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.151653 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.651616116 +0000 UTC m=+149.688369106 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.182098 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xkx2c" podStartSLOduration=126.182070037 podStartE2EDuration="2m6.182070037s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.156957025 +0000 UTC m=+149.193710025" watchObservedRunningTime="2025-10-02 09:34:53.182070037 +0000 UTC m=+149.218823017" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.206211 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpfrs" podStartSLOduration=125.206178852 podStartE2EDuration="2m5.206178852s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.194807407 +0000 UTC m=+149.231560387" watchObservedRunningTime="2025-10-02 09:34:53.206178852 +0000 UTC m=+149.242931832" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.253870 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.257032 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.756996081 +0000 UTC m=+149.793749061 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.268851 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-954gs" podStartSLOduration=126.268823458 podStartE2EDuration="2m6.268823458s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.23228487 +0000 UTC m=+149.269037850" watchObservedRunningTime="2025-10-02 09:34:53.268823458 +0000 UTC m=+149.305576438" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.282638 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" podStartSLOduration=126.282603105 podStartE2EDuration="2m6.282603105s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.267256267 +0000 UTC m=+149.304009267" watchObservedRunningTime="2025-10-02 09:34:53.282603105 +0000 UTC m=+149.319356085" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.294521 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-xpzp4" podStartSLOduration=126.294499124 podStartE2EDuration="2m6.294499124s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.292919393 +0000 UTC m=+149.329672383" watchObservedRunningTime="2025-10-02 09:34:53.294499124 +0000 UTC m=+149.331252104" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.326369 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-87568" podStartSLOduration=125.32634308 podStartE2EDuration="2m5.32634308s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.325215661 +0000 UTC m=+149.361968651" watchObservedRunningTime="2025-10-02 09:34:53.32634308 +0000 UTC m=+149.363096060" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.355964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.356437 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 09:34:53.856419511 +0000 UTC m=+149.893172491 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.389208 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5zmm" podStartSLOduration=126.389184041 podStartE2EDuration="2m6.389184041s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.353805183 +0000 UTC m=+149.390558183" watchObservedRunningTime="2025-10-02 09:34:53.389184041 +0000 UTC m=+149.425937021" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.390765 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c5szj" podStartSLOduration=125.390760772 podStartE2EDuration="2m5.390760772s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.387818105 +0000 UTC m=+149.424571085" watchObservedRunningTime="2025-10-02 09:34:53.390760772 +0000 UTC m=+149.427513752" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.458614 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.459004 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:53.958989802 +0000 UTC m=+149.995742782 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.489898 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" podStartSLOduration=126.489866693 podStartE2EDuration="2m6.489866693s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.431814027 +0000 UTC m=+149.468567007" watchObservedRunningTime="2025-10-02 09:34:53.489866693 +0000 UTC m=+149.526619683" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.544394 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fm8qg" podStartSLOduration=125.544294916 podStartE2EDuration="2m5.544294916s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.542946801 +0000 UTC m=+149.579699781" watchObservedRunningTime="2025-10-02 09:34:53.544294916 +0000 UTC m=+149.581047896" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.550707 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" podStartSLOduration=125.550669771 podStartE2EDuration="2m5.550669771s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.494670108 +0000 UTC m=+149.531423088" watchObservedRunningTime="2025-10-02 09:34:53.550669771 +0000 UTC m=+149.587422761" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.561541 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.561766 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.061724838 +0000 UTC m=+150.098477818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.561821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.563628 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.063607907 +0000 UTC m=+150.100360887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.570502 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-45mp5" podStartSLOduration=125.570481145 podStartE2EDuration="2m5.570481145s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:53.568914425 +0000 UTC m=+149.605667425" watchObservedRunningTime="2025-10-02 09:34:53.570481145 +0000 UTC m=+149.607234115" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.663009 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.663252 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.163215502 +0000 UTC m=+150.199968482 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.663369 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.663815 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.163799577 +0000 UTC m=+150.200552547 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.724450 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:34:53 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:34:53 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:34:53 crc kubenswrapper[4722]: healthz check failed Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.724817 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.764390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.764593 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.264552741 +0000 UTC m=+150.301305721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.764991 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.765403 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.265378773 +0000 UTC m=+150.302131753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.866869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.867102 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.367063081 +0000 UTC m=+150.403816061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.867254 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.867668 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.367657237 +0000 UTC m=+150.404410277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.968590 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.968825 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.468782141 +0000 UTC m=+150.505535141 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:53 crc kubenswrapper[4722]: I1002 09:34:53.969550 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:53 crc kubenswrapper[4722]: E1002 09:34:53.970009 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.470000922 +0000 UTC m=+150.506753902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.070838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.071096 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.571055685 +0000 UTC m=+150.607808675 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.071271 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.071775 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.571753503 +0000 UTC m=+150.608506543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.090960 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h645w" event={"ID":"2186d01d-5c96-4c50-9414-bce7cd2d7052","Type":"ContainerStarted","Data":"d0327e2f0f831d92e0d6cdabbd437249d956371aa06b8caf7a1216451edab4d9"} Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.093454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" event={"ID":"bf4132d7-ccbc-4e9a-adb2-2b92ee9165d8","Type":"ContainerStarted","Data":"9a02637b2fd1d22297c0b133cc22e59ccac932a9db624f477ba28a387746833e"} Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.095388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" event={"ID":"5c5d5270-f70e-4691-8bbd-ed6bbf5ac09d","Type":"ContainerStarted","Data":"2fecb27f35b8f199ff1f21839e435782948b40cab6d5035bc730d88b2efa1ceb"} Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.097219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" event={"ID":"ddbb576c-7cc0-4b9e-89b2-6ca265b6aa6b","Type":"ContainerStarted","Data":"07b9c54712211de66c92a5296f88416f84a3f0b3d7db3a96c331d59dd5d7cc9e"} Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.100012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" event={"ID":"722a2a42-6f33-4002-ab72-3544df821d3b","Type":"ContainerStarted","Data":"8b8e49a48d2aa535fe46d8b80065648e8eed4363d9bf2757cafc09f4f5220f3e"} Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.101810 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zk6t7 
container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.101850 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" podUID="e6070617-4dba-40fb-a6a8-03f6807a6846" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.102461 4722 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-r8rd5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.102663 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" podUID="e5de8035-7427-4646-ad95-13c58ecad616" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.161072 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" podStartSLOduration=127.16104644 podStartE2EDuration="2m7.16104644s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:54.159811918 +0000 UTC m=+150.196564898" watchObservedRunningTime="2025-10-02 09:34:54.16104644 +0000 UTC m=+150.197799420" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.162848 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ghd2" podStartSLOduration=126.162839286 podStartE2EDuration="2m6.162839286s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:54.126045952 +0000 UTC m=+150.162798942" watchObservedRunningTime="2025-10-02 09:34:54.162839286 +0000 UTC m=+150.199592266" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.172705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.173365 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.673349269 +0000 UTC m=+150.710102249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.236389 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hwbdb" podStartSLOduration=9.236362404 podStartE2EDuration="9.236362404s" podCreationTimestamp="2025-10-02 09:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:54.235584624 +0000 UTC m=+150.272337624" watchObservedRunningTime="2025-10-02 09:34:54.236362404 +0000 UTC m=+150.273115384" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.239029 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-42lsk" podStartSLOduration=126.239008113 podStartE2EDuration="2m6.239008113s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:54.18993925 +0000 UTC m=+150.226692230" watchObservedRunningTime="2025-10-02 09:34:54.239008113 +0000 UTC m=+150.275761093" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.269559 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hwbdb" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.276586 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.277414 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.777384729 +0000 UTC m=+150.814137839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.378641 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.378807 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.87878117 +0000 UTC m=+150.915534160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.378948 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.379259 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.879250162 +0000 UTC m=+150.916003142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.480574 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.481016 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:54.980995832 +0000 UTC m=+151.017748812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.582118 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.582613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.582737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.582847 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.582939 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.584913 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:55.084889688 +0000 UTC m=+151.121642668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.588443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.612769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.633660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.635955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.684783 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.685774 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-10-02 09:34:55.185749385 +0000 UTC m=+151.222502365 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.728436 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:34:54 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:34:54 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:34:54 crc kubenswrapper[4722]: healthz check failed Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.728501 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.752783 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nhcxl" podStartSLOduration=126.752749034 podStartE2EDuration="2m6.752749034s" podCreationTimestamp="2025-10-02 09:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:54.279993637 +0000 UTC m=+150.316746627" watchObservedRunningTime="2025-10-02 09:34:54.752749034 +0000 UTC m=+150.789502014" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.786860 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.787389 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:55.287364602 +0000 UTC m=+151.324117582 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.823447 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.837662 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.840358 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.890021 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.890473 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:55.390449837 +0000 UTC m=+151.427202817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:54 crc kubenswrapper[4722]: I1002 09:34:54.993541 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:54 crc kubenswrapper[4722]: E1002 09:34:54.993941 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:55.493926842 +0000 UTC m=+151.530679822 (durationBeforeRetry 500ms). 
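A side note on the "Observed pod startup duration" record just above: both pull timestamps are the zero value (0001-01-01), meaning no image pull was observed, so podStartSLOduration equals the plain wall-clock gap between podCreationTimestamp and the observed running time. Checking the arithmetic with the two timestamps taken from that record:

    // Back-of-the-envelope check of podStartSLOduration: observed running
    // time minus pod creation time, using the constants from the log record.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        created, _ := time.Parse(time.RFC3339, "2025-10-02T09:32:48Z")
        running, _ := time.Parse(time.RFC3339Nano, "2025-10-02T09:34:54.752749034Z")
        fmt.Println(running.Sub(created)) // 2m6.752749034s, matching the log
    }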
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.022627 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b2tld" Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.104302 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:55 crc kubenswrapper[4722]: E1002 09:34:55.105042 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:55.605021135 +0000 UTC m=+151.641774115 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.206302 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:55 crc kubenswrapper[4722]: E1002 09:34:55.206718 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:55.706705984 +0000 UTC m=+151.743458964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.308273 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:55 crc kubenswrapper[4722]: E1002 09:34:55.310152 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:55.810133047 +0000 UTC m=+151.846886027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.413111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:55 crc kubenswrapper[4722]: E1002 09:34:55.413963 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:55.913947091 +0000 UTC m=+151.950700071 (durationBeforeRetry 500ms). 
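The "No retries permitted until ... (durationBeforeRetry 500ms)" lines show the operation executor's pacing: a failed volume operation may not be retried before the stated deadline, which sits 500ms after the failure. In this excerpt the reported delay stays at the initial 500ms on every cycle; as a general mechanism, kubelet-style exponential backoff starts there and grows toward a cap, along the lines of the sketch below (the doubling and the 2m2s cap are assumptions about behavior not visible in this log):

    // Hedged sketch of retry pacing: start at 500ms (as in the log) and
    // double up to a cap on repeated failures of the same operation.
    package main

    import (
        "fmt"
        "time"
    )

    type backoff struct{ delay time.Duration }

    func (b *backoff) next() time.Duration {
        const (
            initial  = 500 * time.Millisecond
            maxDelay = 2*time.Minute + 2*time.Second // assumed cap
        )
        if b.delay == 0 {
            b.delay = initial
        } else {
            b.delay *= 2
            if b.delay > maxDelay {
                b.delay = maxDelay
            }
        }
        return b.delay
    }

    func main() {
        var b backoff
        for i := 0; i < 5; i++ {
            fmt.Println(b.next()) // 500ms, 1s, 2s, 4s, 8s
        }
    }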
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.520967 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:55 crc kubenswrapper[4722]: E1002 09:34:55.521286 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:56.021263635 +0000 UTC m=+152.058016615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.623992 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:55 crc kubenswrapper[4722]: E1002 09:34:55.634339 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:56.134301418 +0000 UTC m=+152.171054398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.725407 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:55 crc kubenswrapper[4722]: E1002 09:34:55.726155 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:56.226127261 +0000 UTC m=+152.262880241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.734890 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:34:55 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:34:55 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:34:55 crc kubenswrapper[4722]: healthz check failed Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.735293 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.802271 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.803097 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.805962 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.810503 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.814207 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.827070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:55 crc kubenswrapper[4722]: E1002 09:34:55.827737 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:56.327706707 +0000 UTC m=+152.364459857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.916975 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p9hf8"] Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.918332 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.922338 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.927955 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.928222 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9dc19c16-fd8f-4ec7-9066-b18aff12d731\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.928290 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9dc19c16-fd8f-4ec7-9066-b18aff12d731\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 09:34:55 crc kubenswrapper[4722]: E1002 09:34:55.928447 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:56.42842986 +0000 UTC m=+152.465182830 (durationBeforeRetry 500ms). 
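The router startup probe failures interleaved above use the aggregated healthz format: each sub-check is reported as [+] ok or [-] failed ("reason withheld"), and here backend-http and has-synced fail while process-running passes, so the endpoint returns 500 until the router has synced its configuration. To reproduce the probe by hand from the node, something like the following can be used; the port (1936) and path are assumptions based on common OpenShift router defaults, so check the pod's actual startupProbe definition first:

    // Manual check of the router health endpoint the startup probe hits.
    // Endpoint is assumed; verify against the router pod's probe spec.
    package main

    import (
        "fmt"
        "io"
        "net/http"
        "os"
    )

    func main() {
        resp, err := http.Get("http://localhost:1936/healthz")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        defer resp.Body.Close()
        body, _ := io.ReadAll(resp.Body)
        // While failing, expect a 500 plus the per-check [-]/[+] lines
        // seen in the probe output above.
        fmt.Printf("status=%d\n%s", resp.StatusCode, body)
    }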
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:55 crc kubenswrapper[4722]: I1002 09:34:55.934956 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9hf8"] Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.031451 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-catalog-content\") pod \"community-operators-p9hf8\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.032291 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9dc19c16-fd8f-4ec7-9066-b18aff12d731\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.032439 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72slq\" (UniqueName: \"kubernetes.io/projected/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-kube-api-access-72slq\") pod \"community-operators-p9hf8\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.032552 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.032664 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9dc19c16-fd8f-4ec7-9066-b18aff12d731\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.032834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-utilities\") pod \"community-operators-p9hf8\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.033082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9dc19c16-fd8f-4ec7-9066-b18aff12d731\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 
09:34:56.033387 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:56.533361753 +0000 UTC m=+152.570114913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.046351 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9f2qr"] Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.052687 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.065218 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.067553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9dc19c16-fd8f-4ec7-9066-b18aff12d731\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.068762 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9f2qr"] Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.120906 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.137687 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.137856 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-catalog-content\") pod \"community-operators-p9hf8\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.137894 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72slq\" (UniqueName: \"kubernetes.io/projected/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-kube-api-access-72slq\") pod \"community-operators-p9hf8\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.137960 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-utilities\") pod \"community-operators-p9hf8\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.138441 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-utilities\") pod \"community-operators-p9hf8\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 09:34:56.138524 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:56.638507772 +0000 UTC m=+152.675260752 (durationBeforeRetry 500ms). 
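The repeated util.go:30 "No sandbox for pod can be found. Need to start a new one" lines are the kubelet noticing that a newly admitted pod has no pod sandbox yet, so it will ask the CRI runtime (CRI-O on this node) to create one before any containers can start. A toy sketch of that decision, using stand-in types rather than kubelet's real ones:

    // Simplified decision behind "Need to start a new one": if the pod has
    // no known sandboxes, or none of them is ready, create a fresh one.
    package main

    import "fmt"

    type sandbox struct {
        id    string
        ready bool
    }

    func needsNewSandbox(sandboxes []sandbox) bool {
        for _, s := range sandboxes {
            if s.ready {
                return false // reuse the existing ready sandbox
            }
        }
        return true // none found (or none ready): start a new one
    }

    func main() {
        fmt.Println(needsNewSandbox(nil))                          // true: brand-new pod
        fmt.Println(needsNewSandbox([]sandbox{{"abc123", true}}))  // false
        fmt.Println(needsNewSandbox([]sandbox{{"abc123", false}})) // true: recreate
    }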
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.138754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-catalog-content\") pod \"community-operators-p9hf8\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.156118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7cc571252dd7408c48b611a1c8042a285c48a66682f2ca72c3f5a15bdc228516"} Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.161896 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8rd5" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.162485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ce9f416b6c0c9525b94152264fd60852411a6767facf99d56976514968368f43"} Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.165992 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72slq\" (UniqueName: \"kubernetes.io/projected/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-kube-api-access-72slq\") pod \"community-operators-p9hf8\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.183071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a9dacd0db761eba2b272b068e5518094de5e9137aa349211a730a944e8782f95"} Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.240075 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-catalog-content\") pod \"certified-operators-9f2qr\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.240204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-utilities\") pod \"certified-operators-9f2qr\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.240243 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cphrq\" (UniqueName: 
\"kubernetes.io/projected/5c352a70-fb33-4e66-a950-0e831abd5d45-kube-api-access-cphrq\") pod \"certified-operators-9f2qr\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.240326 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 09:34:56.240693 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:56.740676883 +0000 UTC m=+152.777429863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.244190 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8zdkg"] Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.249341 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.260470 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.266343 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8zdkg"] Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.341735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.341948 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-utilities\") pod \"certified-operators-9f2qr\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.341993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cphrq\" (UniqueName: \"kubernetes.io/projected/5c352a70-fb33-4e66-a950-0e831abd5d45-kube-api-access-cphrq\") pod \"certified-operators-9f2qr\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.342118 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-catalog-content\") pod \"certified-operators-9f2qr\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.342703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-catalog-content\") pod \"certified-operators-9f2qr\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 09:34:56.342808 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:56.842786192 +0000 UTC m=+152.879539172 (durationBeforeRetry 500ms). 
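The interleaved UnmountVolume / MountVolume / VerifyControllerAttachedVolume records come from the volume manager's reconciler, which periodically diffs a desired state of the world (volumes the current pods need) against the actual state (what is attached and mounted). That is why the same PVC appears both in an unmount keyed to the departed pod UID 8f668bae-... and in a mount for the replacement image-registry pod: state is tracked per pod and volume, not per volume alone. A deliberately simplified sketch of the diff, keyed by "podUID/volumeName":

    // Hedged sketch of the reconcile pattern in these lines; the real
    // reconciler tracks far more state (attach status, device paths, SELinux
    // contexts) than this toy diff.
    package main

    import "fmt"

    func reconcile(desired, actual map[string]bool) {
        // Mounted but no longer desired: unmount.
        for pv := range actual {
            if !desired[pv] {
                fmt.Println("UnmountVolume started for", pv)
            }
        }
        // Desired but not yet mounted: verify attach, then mount.
        for pv := range desired {
            if !actual[pv] {
                fmt.Println("MountVolume started for", pv)
            }
        }
    }

    func main() {
        desired := map[string]bool{
            "160f98c9/pvc-657094db": true, // new image-registry pod wants the PVC
        }
        actual := map[string]bool{
            "8f668bae/pvc-657094db": true, // departed pod's mount still present
        }
        reconcile(desired, actual) // unmount for 8f668bae, mount for 160f98c9
    }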
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.343077 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-utilities\") pod \"certified-operators-9f2qr\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.365652 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cphrq\" (UniqueName: \"kubernetes.io/projected/5c352a70-fb33-4e66-a950-0e831abd5d45-kube-api-access-cphrq\") pod \"certified-operators-9f2qr\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.381440 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.447945 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n5224"] Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.456784 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-catalog-content\") pod \"community-operators-8zdkg\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") " pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.458365 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.458414 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j26pq\" (UniqueName: \"kubernetes.io/projected/ea335a73-5af7-4bd2-8439-cfa2304b594e-kube-api-access-j26pq\") pod \"community-operators-8zdkg\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") " pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.458493 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-utilities\") pod \"community-operators-8zdkg\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") " pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 09:34:56.458965 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 09:34:56.958942127 +0000 UTC m=+152.995695107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.466836 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n5224"] Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.467024 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.529697 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.571480 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.571838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j26pq\" (UniqueName: \"kubernetes.io/projected/ea335a73-5af7-4bd2-8439-cfa2304b594e-kube-api-access-j26pq\") pod \"community-operators-8zdkg\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") " pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.571954 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-catalog-content\") pod \"certified-operators-n5224\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") " pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.572075 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-utilities\") pod \"certified-operators-n5224\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") " pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.572207 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-utilities\") pod \"community-operators-8zdkg\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") " pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.572308 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-catalog-content\") pod \"community-operators-8zdkg\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") " pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc 
kubenswrapper[4722]: I1002 09:34:56.572465 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz5v5\" (UniqueName: \"kubernetes.io/projected/d1581831-b357-4fe6-8e8b-5bdbc057f867-kube-api-access-gz5v5\") pod \"certified-operators-n5224\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") " pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 09:34:56.572685 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.072660797 +0000 UTC m=+153.109413777 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.573705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-utilities\") pod \"community-operators-8zdkg\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") " pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.574053 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-catalog-content\") pod \"community-operators-8zdkg\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") " pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.614259 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j26pq\" (UniqueName: \"kubernetes.io/projected/ea335a73-5af7-4bd2-8439-cfa2304b594e-kube-api-access-j26pq\") pod \"community-operators-8zdkg\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") " pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.674742 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz5v5\" (UniqueName: \"kubernetes.io/projected/d1581831-b357-4fe6-8e8b-5bdbc057f867-kube-api-access-gz5v5\") pod \"certified-operators-n5224\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") " pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.674811 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.674844 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-catalog-content\") pod 
\"certified-operators-n5224\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") " pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.674879 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-utilities\") pod \"certified-operators-n5224\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") " pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.675467 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-utilities\") pod \"certified-operators-n5224\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") " pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 09:34:56.676053 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.17603787 +0000 UTC m=+153.212790850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.676519 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-catalog-content\") pod \"certified-operators-n5224\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") " pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.707439 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz5v5\" (UniqueName: \"kubernetes.io/projected/d1581831-b357-4fe6-8e8b-5bdbc057f867-kube-api-access-gz5v5\") pod \"certified-operators-n5224\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") " pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.748135 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9hf8"] Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.776923 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 09:34:56.778645 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.278598221 +0000 UTC m=+153.315351201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.778697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 09:34:56.779140 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.279129255 +0000 UTC m=+153.315882235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.793065 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:34:56 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:34:56 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:34:56 crc kubenswrapper[4722]: healthz check failed Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.793153 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:34:56 crc kubenswrapper[4722]: W1002 09:34:56.795179 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7f1ec0_f82a_4911_b120_ba1c25a4a200.slice/crio-bf2d679f68c08b0ee3c360eef3f842e8805de28eee48911b32b4dbc84e6b1af1 WatchSource:0}: Error finding container bf2d679f68c08b0ee3c360eef3f842e8805de28eee48911b32b4dbc84e6b1af1: Status 404 returned error can't find the container with id bf2d679f68c08b0ee3c360eef3f842e8805de28eee48911b32b4dbc84e6b1af1 Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.826639 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.869189 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.883023 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 09:34:56.902359 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.402327012 +0000 UTC m=+153.439079992 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:56 crc kubenswrapper[4722]: I1002 09:34:56.986607 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:56 crc kubenswrapper[4722]: E1002 09:34:56.986917 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.486904137 +0000 UTC m=+153.523657117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.018347 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-s727n" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.038280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.087689 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.090764 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.590734521 +0000 UTC m=+153.627487681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.126737 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n5224"] Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.180207 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9f2qr"] Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.190210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.190613 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.690590992 +0000 UTC m=+153.727344152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.194531 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8a78776cf4b8b061fde7e657553586c662333d9e65ec4e0482d38e5347504131"} Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.195680 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9dc19c16-fd8f-4ec7-9066-b18aff12d731","Type":"ContainerStarted","Data":"3ad99d4814454f3def5d3abca04cd6c736237fddb81d9afa25d410167a44a8a7"} Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.196682 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5224" event={"ID":"d1581831-b357-4fe6-8e8b-5bdbc057f867","Type":"ContainerStarted","Data":"30f3395a5b5bffa19d8d4592594ab592221b91e1bc87ba000560e4f3555c48d1"} Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.197584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9hf8" event={"ID":"1e7f1ec0-f82a-4911-b120-ba1c25a4a200","Type":"ContainerStarted","Data":"bf2d679f68c08b0ee3c360eef3f842e8805de28eee48911b32b4dbc84e6b1af1"} Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.283572 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8zdkg"] Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.294119 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.294567 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.79454655 +0000 UTC m=+153.831299530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.346152 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.397838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.399849 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.899826571 +0000 UTC m=+153.936579541 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.498729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.498929 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.998890602 +0000 UTC m=+154.035643582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.499217 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.499852 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:57.999833657 +0000 UTC m=+154.036586637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.504284 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.504390 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.515850 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.600257 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.600500 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.100464928 +0000 UTC m=+154.137217908 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.600850 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.601495 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.101469174 +0000 UTC m=+154.138222154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.645345 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.645396 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.648191 4722 patch_prober.go:28] interesting pod/console-f9d7485db-clnnl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.648308 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-clnnl" podUID="6898b032-cd5c-42cf-8ebf-df3fec127c5e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.652940 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.653037 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.653142 4722 
patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.653167 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.701429 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.701560 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.201534471 +0000 UTC m=+154.238287451 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.701776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.703193 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.203184883 +0000 UTC m=+154.239937863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.718703 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.735561 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:34:57 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:34:57 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:34:57 crc kubenswrapper[4722]: healthz check failed Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.735654 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.786248 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.786672 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.802836 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.803287 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.30326403 +0000 UTC m=+154.340017010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.803446 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.808851 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.308820624 +0000 UTC m=+154.345573604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.852557 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.859043 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-422gm" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.872230 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.892211 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.904838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:57 crc kubenswrapper[4722]: E1002 09:34:57.905092 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.405053022 +0000 UTC m=+154.441806002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.918609 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:57 crc kubenswrapper[4722]: I1002 09:34:57.934802 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5bhmb" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.007176 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.008231 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.508216579 +0000 UTC m=+154.544969559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.045300 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x8b9l"] Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.046793 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.058048 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.069539 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8b9l"] Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.108335 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.108439 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg9mx\" (UniqueName: \"kubernetes.io/projected/a1c11523-129d-450b-bd1f-b72d6a019660-kube-api-access-tg9mx\") pod \"redhat-marketplace-x8b9l\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.108478 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-utilities\") pod \"redhat-marketplace-x8b9l\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.108506 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-catalog-content\") pod \"redhat-marketplace-x8b9l\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.108624 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.608604654 +0000 UTC m=+154.645357634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.205039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d2f20853f04afd2729a3a3bb35afa52a2c7b036fff0da1d30aea2c18e428d7f4"} Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.206158 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.207492 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9dc19c16-fd8f-4ec7-9066-b18aff12d731","Type":"ContainerStarted","Data":"c397da302882e4da87f073315032affc4b65fd29e074b3ccba52fe5790b05110"} Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.209302 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.209405 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg9mx\" (UniqueName: \"kubernetes.io/projected/a1c11523-129d-450b-bd1f-b72d6a019660-kube-api-access-tg9mx\") pod \"redhat-marketplace-x8b9l\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.209456 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-utilities\") pod \"redhat-marketplace-x8b9l\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.209498 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-catalog-content\") pod \"redhat-marketplace-x8b9l\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.209745 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.709724887 +0000 UTC m=+154.746477867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.210299 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-utilities\") pod \"redhat-marketplace-x8b9l\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.210348 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-catalog-content\") pod \"redhat-marketplace-x8b9l\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.212868 4722 generic.go:334] "Generic (PLEG): container finished" podID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerID="f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55" exitCode=0 Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.212940 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zdkg" event={"ID":"ea335a73-5af7-4bd2-8439-cfa2304b594e","Type":"ContainerDied","Data":"f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55"} Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.212971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zdkg" event={"ID":"ea335a73-5af7-4bd2-8439-cfa2304b594e","Type":"ContainerStarted","Data":"465382d855915037021a53e4c830224d30a576d92c40686fee617f4873c4c5cf"} Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.215078 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.215650 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerID="bae21b5465783ca8558d73e5ad1491c543547988abfa477b2be37e8222c45e2c" exitCode=0 Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.215763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5224" event={"ID":"d1581831-b357-4fe6-8e8b-5bdbc057f867","Type":"ContainerDied","Data":"bae21b5465783ca8558d73e5ad1491c543547988abfa477b2be37e8222c45e2c"} Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.221745 4722 generic.go:334] "Generic (PLEG): container finished" podID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerID="861c05cecc7e7be32a4c87f237d5692e69747ac8726e983ad2b7e9496ef91eb0" exitCode=0 Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.221833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9hf8" event={"ID":"1e7f1ec0-f82a-4911-b120-ba1c25a4a200","Type":"ContainerDied","Data":"861c05cecc7e7be32a4c87f237d5692e69747ac8726e983ad2b7e9496ef91eb0"} Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.225984 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2435f4018617b5be25add1b6c2798e6c2d10509984afb84f7db9ddb048cea02e"} Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.245803 4722 generic.go:334] "Generic (PLEG): container finished" podID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerID="a8ab04421e796425247d2fd81d09e955d4da9ec2d47fdbace5106e9705ec8ab0" exitCode=0 Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.247198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2qr" event={"ID":"5c352a70-fb33-4e66-a950-0e831abd5d45","Type":"ContainerDied","Data":"a8ab04421e796425247d2fd81d09e955d4da9ec2d47fdbace5106e9705ec8ab0"} Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.247241 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2qr" event={"ID":"5c352a70-fb33-4e66-a950-0e831abd5d45","Type":"ContainerStarted","Data":"7c505a10b19b5150c6458a308837f7b57a39d881993b59ac6635834f65c4d30e"} Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.268398 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg9mx\" (UniqueName: \"kubernetes.io/projected/a1c11523-129d-450b-bd1f-b72d6a019660-kube-api-access-tg9mx\") pod \"redhat-marketplace-x8b9l\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.277674 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fpttx" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.310794 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.311178 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.81115877 +0000 UTC m=+154.847911750 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.339637 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.339615638 podStartE2EDuration="3.339615638s" podCreationTimestamp="2025-10-02 09:34:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:34:58.301427687 +0000 UTC m=+154.338180667" watchObservedRunningTime="2025-10-02 09:34:58.339615638 +0000 UTC m=+154.376368618" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.370336 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.412881 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.416454 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:58.916436291 +0000 UTC m=+154.953189271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.449706 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vpgk2"] Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.481467 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.506377 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpgk2"] Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.516215 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.517065 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.016995981 +0000 UTC m=+155.053748951 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.518115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.519080 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.019059754 +0000 UTC m=+155.055812734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.619303 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.619725 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-utilities\") pod \"redhat-marketplace-vpgk2\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.619765 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-catalog-content\") pod \"redhat-marketplace-vpgk2\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.619791 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kgkb\" (UniqueName: \"kubernetes.io/projected/52ea146c-d933-4406-985c-7ec1c9386710-kube-api-access-4kgkb\") pod \"redhat-marketplace-vpgk2\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.619909 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.119891611 +0000 UTC m=+155.156644591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.721212 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.721271 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-utilities\") pod \"redhat-marketplace-vpgk2\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.721309 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-catalog-content\") pod \"redhat-marketplace-vpgk2\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.721352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kgkb\" (UniqueName: \"kubernetes.io/projected/52ea146c-d933-4406-985c-7ec1c9386710-kube-api-access-4kgkb\") pod \"redhat-marketplace-vpgk2\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.722037 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.222025561 +0000 UTC m=+155.258778541 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.722574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-utilities\") pod \"redhat-marketplace-vpgk2\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.726567 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-catalog-content\") pod \"redhat-marketplace-vpgk2\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.734185 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:34:58 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:34:58 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:34:58 crc kubenswrapper[4722]: healthz check failed Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.734258 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.778264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kgkb\" (UniqueName: \"kubernetes.io/projected/52ea146c-d933-4406-985c-7ec1c9386710-kube-api-access-4kgkb\") pod \"redhat-marketplace-vpgk2\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.824474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.825133 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.325107746 +0000 UTC m=+155.361860726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.827187 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:34:58 crc kubenswrapper[4722]: E1002 09:34:58.934950 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.434926036 +0000 UTC m=+155.471679016 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:58 crc kubenswrapper[4722]: I1002 09:34:58.935217 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.041005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.041433 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.541415788 +0000 UTC m=+155.578168768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.041834 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n2vk2"] Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.042952 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.048169 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.062828 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2vk2"] Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.095473 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8b9l"] Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.149428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f9ng\" (UniqueName: \"kubernetes.io/projected/d2ad2457-159c-4841-82fe-ea109055fe28-kube-api-access-4f9ng\") pod \"redhat-operators-n2vk2\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.149515 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-utilities\") pod \"redhat-operators-n2vk2\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.149590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.149905 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-catalog-content\") pod \"redhat-operators-n2vk2\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.150197 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.65017385 +0000 UTC m=+155.686926830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.181443 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.197159 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.199760 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.200711 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.202607 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.251066 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.251759 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.751739776 +0000 UTC m=+155.788492756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.251860 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-catalog-content\") pod \"redhat-operators-n2vk2\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.251918 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f9ng\" (UniqueName: \"kubernetes.io/projected/d2ad2457-159c-4841-82fe-ea109055fe28-kube-api-access-4f9ng\") pod \"redhat-operators-n2vk2\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.251945 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-utilities\") pod \"redhat-operators-n2vk2\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.251972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.252306 4722 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.75229606 +0000 UTC m=+155.789049040 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.252870 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-utilities\") pod \"redhat-operators-n2vk2\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.252929 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-catalog-content\") pod \"redhat-operators-n2vk2\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.270721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h645w" event={"ID":"2186d01d-5c96-4c50-9414-bce7cd2d7052","Type":"ContainerStarted","Data":"0b7dd679abfb10752424cc4dc76307fa2d96a33b93324db8a74c6e7d16785603"} Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.295246 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f9ng\" (UniqueName: \"kubernetes.io/projected/d2ad2457-159c-4841-82fe-ea109055fe28-kube-api-access-4f9ng\") pod \"redhat-operators-n2vk2\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.329753 4722 generic.go:334] "Generic (PLEG): container finished" podID="9dc19c16-fd8f-4ec7-9066-b18aff12d731" containerID="c397da302882e4da87f073315032affc4b65fd29e074b3ccba52fe5790b05110" exitCode=0 Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.330130 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9dc19c16-fd8f-4ec7-9066-b18aff12d731","Type":"ContainerDied","Data":"c397da302882e4da87f073315032affc4b65fd29e074b3ccba52fe5790b05110"} Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.355721 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.356117 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b4c583-6def-4600-8894-261e875642ab-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: 
\"74b4c583-6def-4600-8894-261e875642ab\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.356174 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74b4c583-6def-4600-8894-261e875642ab-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"74b4c583-6def-4600-8894-261e875642ab\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.356865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8b9l" event={"ID":"a1c11523-129d-450b-bd1f-b72d6a019660","Type":"ContainerStarted","Data":"536e584e52a908b96f24fdebe628578be9771395a770e786eff418d04a02c2f3"} Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.357502 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.857463369 +0000 UTC m=+155.894216539 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.406954 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.439771 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xl585"] Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.441004 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.457254 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.457386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b4c583-6def-4600-8894-261e875642ab-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"74b4c583-6def-4600-8894-261e875642ab\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.457448 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74b4c583-6def-4600-8894-261e875642ab-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"74b4c583-6def-4600-8894-261e875642ab\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.457583 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74b4c583-6def-4600-8894-261e875642ab-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"74b4c583-6def-4600-8894-261e875642ab\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.458268 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:34:59.958240044 +0000 UTC m=+155.994993024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.458994 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xl585"] Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.507452 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b4c583-6def-4600-8894-261e875642ab-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"74b4c583-6def-4600-8894-261e875642ab\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.507698 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpgk2"] Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.530813 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.558659 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.558990 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-utilities\") pod \"redhat-operators-xl585\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.559038 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-catalog-content\") pod \"redhat-operators-xl585\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.559169 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5cff\" (UniqueName: \"kubernetes.io/projected/463ed397-56f9-4b1b-b32b-4d2204f7632a-kube-api-access-h5cff\") pod \"redhat-operators-xl585\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.560773 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.060740924 +0000 UTC m=+156.097493904 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.662514 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5cff\" (UniqueName: \"kubernetes.io/projected/463ed397-56f9-4b1b-b32b-4d2204f7632a-kube-api-access-h5cff\") pod \"redhat-operators-xl585\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.662604 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.662630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-utilities\") pod \"redhat-operators-xl585\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.662667 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-catalog-content\") pod \"redhat-operators-xl585\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.663296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-catalog-content\") pod \"redhat-operators-xl585\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.663881 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.16386181 +0000 UTC m=+156.200614790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.664254 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-utilities\") pod \"redhat-operators-xl585\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " pod="openshift-marketplace/redhat-operators-xl585"
Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.692470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5cff\" (UniqueName: \"kubernetes.io/projected/463ed397-56f9-4b1b-b32b-4d2204f7632a-kube-api-access-h5cff\") pod \"redhat-operators-xl585\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " pod="openshift-marketplace/redhat-operators-xl585"
Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.731477 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 09:34:59 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]process-running ok
Oct 02 09:34:59 crc kubenswrapper[4722]: healthz check failed
Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.731548 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.743115 4722 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jkt7j container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]log ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]etcd ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]poststarthook/max-in-flight-filter ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Oct 02 09:34:59 crc kubenswrapper[4722]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]poststarthook/project.openshift.io-projectcache ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-startinformers ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-restmapperupdater ok
Oct 02 09:34:59 crc kubenswrapper[4722]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Oct 02 09:34:59 crc kubenswrapper[4722]: livez check failed
Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.743174 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" podUID="722a2a42-6f33-4002-ab72-3544df821d3b" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.766427 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.767272 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.267247903 +0000 UTC m=+156.304000883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.786617 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xl585"
Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.872398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.872855 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.372838222 +0000 UTC m=+156.409591202 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.890332 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2vk2"] Oct 02 09:34:59 crc kubenswrapper[4722]: W1002 09:34:59.900545 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2ad2457_159c_4841_82fe_ea109055fe28.slice/crio-b9bd2f9b9942cb8b1359c5a8a6ddd457869d8176a0b690be9b22e0ca4e023d61 WatchSource:0}: Error finding container b9bd2f9b9942cb8b1359c5a8a6ddd457869d8176a0b690be9b22e0ca4e023d61: Status 404 returned error can't find the container with id b9bd2f9b9942cb8b1359c5a8a6ddd457869d8176a0b690be9b22e0ca4e023d61 Oct 02 09:34:59 crc kubenswrapper[4722]: I1002 09:34:59.986362 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:34:59 crc kubenswrapper[4722]: E1002 09:34:59.986813 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.486775449 +0000 UTC m=+156.523528429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.031163 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 09:35:00 crc kubenswrapper[4722]: W1002 09:35:00.067436 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod74b4c583_6def_4600_8894_261e875642ab.slice/crio-3093dafff3b4b7a42cb884f9e8a4ac49f4e26f09b4c9d990b1860d191876e462 WatchSource:0}: Error finding container 3093dafff3b4b7a42cb884f9e8a4ac49f4e26f09b4c9d990b1860d191876e462: Status 404 returned error can't find the container with id 3093dafff3b4b7a42cb884f9e8a4ac49f4e26f09b4c9d990b1860d191876e462 Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.088306 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.088732 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.588713034 +0000 UTC m=+156.625466014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.093021 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xl585"] Oct 02 09:35:00 crc kubenswrapper[4722]: W1002 09:35:00.104673 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod463ed397_56f9_4b1b_b32b_4d2204f7632a.slice/crio-f82b764fba18e946edde77a4f7f9e4c7790f2dcf078b7649dd6c90393bad9cd9 WatchSource:0}: Error finding container f82b764fba18e946edde77a4f7f9e4c7790f2dcf078b7649dd6c90393bad9cd9: Status 404 returned error can't find the container with id f82b764fba18e946edde77a4f7f9e4c7790f2dcf078b7649dd6c90393bad9cd9 Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.189248 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.189933 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.68990897 +0000 UTC m=+156.726661950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.291658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.292779 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.792732978 +0000 UTC m=+156.829485968 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.365416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2vk2" event={"ID":"d2ad2457-159c-4841-82fe-ea109055fe28","Type":"ContainerStarted","Data":"b9bd2f9b9942cb8b1359c5a8a6ddd457869d8176a0b690be9b22e0ca4e023d61"} Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.368075 4722 generic.go:334] "Generic (PLEG): container finished" podID="a1c11523-129d-450b-bd1f-b72d6a019660" containerID="117e8feae45f8b255f2bcbfdf3dcb5ac41938decb2108d76a7d90fd228c57271" exitCode=0 Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.368194 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8b9l" event={"ID":"a1c11523-129d-450b-bd1f-b72d6a019660","Type":"ContainerDied","Data":"117e8feae45f8b255f2bcbfdf3dcb5ac41938decb2108d76a7d90fd228c57271"} Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.370232 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpgk2" event={"ID":"52ea146c-d933-4406-985c-7ec1c9386710","Type":"ContainerStarted","Data":"b764963c98d4108da8b9aeb11bb405ec97f136abfc3a1d8d9751097e7fd9e2ae"} Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.380808 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h645w" event={"ID":"2186d01d-5c96-4c50-9414-bce7cd2d7052","Type":"ContainerStarted","Data":"3a911f250b38f2328ead186829e2a9c4bacd979234b4f0b7444256ad92bccfbf"} Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.382829 4722 generic.go:334] "Generic (PLEG): container finished" podID="d30e5a09-5492-426b-89a9-2331cce3cc87" containerID="832518e0d992589482b66314149f1878d43878e531b6241ec07f251280a5d918" exitCode=0 Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.382941 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" event={"ID":"d30e5a09-5492-426b-89a9-2331cce3cc87","Type":"ContainerDied","Data":"832518e0d992589482b66314149f1878d43878e531b6241ec07f251280a5d918"} Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.384160 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"74b4c583-6def-4600-8894-261e875642ab","Type":"ContainerStarted","Data":"3093dafff3b4b7a42cb884f9e8a4ac49f4e26f09b4c9d990b1860d191876e462"} Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.385687 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl585" event={"ID":"463ed397-56f9-4b1b-b32b-4d2204f7632a","Type":"ContainerStarted","Data":"f82b764fba18e946edde77a4f7f9e4c7790f2dcf078b7649dd6c90393bad9cd9"} Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.393501 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.394282 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.894241972 +0000 UTC m=+156.930994982 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.394555 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.395278 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.895250278 +0000 UTC m=+156.932003438 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.496540 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.496771 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.996734672 +0000 UTC m=+157.033487652 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.496851 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.497273 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:35:00.997255635 +0000 UTC m=+157.034008615 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.598261 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.598517 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:35:01.098470742 +0000 UTC m=+157.135223722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.598602 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.599130 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:35:01.099110398 +0000 UTC m=+157.135863378 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.600597 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.701772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.701902 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kubelet-dir\") pod \"9dc19c16-fd8f-4ec7-9066-b18aff12d731\" (UID: \"9dc19c16-fd8f-4ec7-9066-b18aff12d731\") " Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.702089 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:35:01.20206021 +0000 UTC m=+157.238813190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.702167 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kube-api-access\") pod \"9dc19c16-fd8f-4ec7-9066-b18aff12d731\" (UID: \"9dc19c16-fd8f-4ec7-9066-b18aff12d731\") " Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.702188 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9dc19c16-fd8f-4ec7-9066-b18aff12d731" (UID: "9dc19c16-fd8f-4ec7-9066-b18aff12d731"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.702570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.702660 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.703044 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:35:01.203032925 +0000 UTC m=+157.239785905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.713596 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9dc19c16-fd8f-4ec7-9066-b18aff12d731" (UID: "9dc19c16-fd8f-4ec7-9066-b18aff12d731"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.722818 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 09:35:00 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Oct 02 09:35:00 crc kubenswrapper[4722]: [+]process-running ok
Oct 02 09:35:00 crc kubenswrapper[4722]: healthz check failed
Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.723078 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.748257 4722 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.803542 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.803832 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dc19c16-fd8f-4ec7-9066-b18aff12d731-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.803907 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 09:35:01.303889552 +0000 UTC m=+157.340642532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.904674 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:35:00 crc kubenswrapper[4722]: E1002 09:35:00.905204 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 09:35:01.405188781 +0000 UTC m=+157.441941761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kf5f5" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.988867 4722 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-02T09:35:00.748289619Z","Handler":null,"Name":""}
Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.992618 4722 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Oct 02 09:35:00 crc kubenswrapper[4722]: I1002 09:35:00.992649 4722 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.005615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.016264 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.112931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.116359 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
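The records above are the turning point of this section: after two minutes of MountDevice and TearDown attempts failing with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers", the plugin watcher picks up the registration socket at 09:35:00.748, and at 09:35:00.992 the kubelet validates and registers the driver over /var/lib/kubelet/plugins/csi-hostpath/csi.sock. A minimal Go sketch of that validation handshake, performed by hand against the same socket, assuming the CSI spec's Go bindings (github.com/container-storage-interface/spec); this is a diagnostic illustration, not kubelet code:

```go
// Dial the plugin socket named in the registration records above and repeat
// the kubelet's two checks by hand: ask the Identity service for the driver
// name, then ask the Node service whether STAGE_UNSTAGE_VOLUME is advertised.
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/container-storage-interface/spec/lib/go/csi"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	// Endpoint taken from the csi_plugin.go:113 record above.
	conn, err := grpc.Dial("unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Until this call can be answered, the volume manager keeps logging
	// "driver name ... not found in the list of registered CSI drivers".
	info, err := csi.NewIdentityClient(conn).GetPluginInfo(ctx, &csi.GetPluginInfoRequest{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("driver:", info.GetName()) // expect kubevirt.io.hostpath-provisioner

	// A plugin that does not advertise STAGE_UNSTAGE_VOLUME is why the
	// attacher logs "STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...".
	caps, err := csi.NewNodeClient(conn).NodeGetCapabilities(ctx, &csi.NodeGetCapabilitiesRequest{})
	if err != nil {
		log.Fatal(err)
	}
	staged := false
	for _, c := range caps.GetCapabilities() {
		if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
			staged = true
		}
	}
	fmt.Println("STAGE_UNSTAGE_VOLUME:", staged)
}
```

The capability check also explains the last record above: with no STAGE_UNSTAGE_VOLUME, the kubelet skips the NodeStageVolume step entirely and proceeds straight to the per-pod SetUp, which is why MountDevice "succeeds" immediately in the next records.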
Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.116405 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.264398 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kf5f5\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") " pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.420886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9dc19c16-fd8f-4ec7-9066-b18aff12d731","Type":"ContainerDied","Data":"3ad99d4814454f3def5d3abca04cd6c736237fddb81d9afa25d410167a44a8a7"} Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.420954 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ad99d4814454f3def5d3abca04cd6c736237fddb81d9afa25d410167a44a8a7" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.427507 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.434629 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2vk2" event={"ID":"d2ad2457-159c-4841-82fe-ea109055fe28","Type":"ContainerStarted","Data":"f9e273f42a8e41743ee6d02bb596c715950a7a5ba9ef749d728a6ff1dbe73928"} Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.443786 4722 generic.go:334] "Generic (PLEG): container finished" podID="52ea146c-d933-4406-985c-7ec1c9386710" containerID="c9ad4e02298759548a4bb4d2836d9a1b8b01b3b23e5609f3d28ce5162b049a62" exitCode=0 Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.443907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpgk2" event={"ID":"52ea146c-d933-4406-985c-7ec1c9386710","Type":"ContainerDied","Data":"c9ad4e02298759548a4bb4d2836d9a1b8b01b3b23e5609f3d28ce5162b049a62"} Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.457343 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h645w" event={"ID":"2186d01d-5c96-4c50-9414-bce7cd2d7052","Type":"ContainerStarted","Data":"005d1106364d2d2eb90c368088276a6d5f8ae50638d4441aa8c4c910b4a060fa"} Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.532420 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.706802 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.720859 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30e5a09-5492-426b-89a9-2331cce3cc87-config-volume\") pod \"d30e5a09-5492-426b-89a9-2331cce3cc87\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.720956 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d30e5a09-5492-426b-89a9-2331cce3cc87-secret-volume\") pod \"d30e5a09-5492-426b-89a9-2331cce3cc87\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.721059 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk8bv\" (UniqueName: \"kubernetes.io/projected/d30e5a09-5492-426b-89a9-2331cce3cc87-kube-api-access-mk8bv\") pod \"d30e5a09-5492-426b-89a9-2331cce3cc87\" (UID: \"d30e5a09-5492-426b-89a9-2331cce3cc87\") " Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.721883 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30e5a09-5492-426b-89a9-2331cce3cc87-config-volume" (OuterVolumeSpecName: "config-volume") pod "d30e5a09-5492-426b-89a9-2331cce3cc87" (UID: "d30e5a09-5492-426b-89a9-2331cce3cc87"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.725242 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:01 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:01 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:01 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.725307 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.731938 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30e5a09-5492-426b-89a9-2331cce3cc87-kube-api-access-mk8bv" (OuterVolumeSpecName: "kube-api-access-mk8bv") pod "d30e5a09-5492-426b-89a9-2331cce3cc87" (UID: "d30e5a09-5492-426b-89a9-2331cce3cc87"). InnerVolumeSpecName "kube-api-access-mk8bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.747699 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30e5a09-5492-426b-89a9-2331cce3cc87-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d30e5a09-5492-426b-89a9-2331cce3cc87" (UID: "d30e5a09-5492-426b-89a9-2331cce3cc87"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.822293 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d30e5a09-5492-426b-89a9-2331cce3cc87-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.822570 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk8bv\" (UniqueName: \"kubernetes.io/projected/d30e5a09-5492-426b-89a9-2331cce3cc87-kube-api-access-mk8bv\") on node \"crc\" DevicePath \"\"" Oct 02 09:35:01 crc kubenswrapper[4722]: I1002 09:35:01.822581 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30e5a09-5492-426b-89a9-2331cce3cc87-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.055240 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kf5f5"] Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.466999 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.466985 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323290-fh7nk" event={"ID":"d30e5a09-5492-426b-89a9-2331cce3cc87","Type":"ContainerDied","Data":"af9e2cbe63a20dcb7059986ca2ed21ddccd901f1a391ac997340b63e1edfd8a0"} Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.467539 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9e2cbe63a20dcb7059986ca2ed21ddccd901f1a391ac997340b63e1edfd8a0" Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.469537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"74b4c583-6def-4600-8894-261e875642ab","Type":"ContainerStarted","Data":"871665cefa8ec1fbed7c1fa7b1e60a81e6fb5eddece82aed06cec218ff300429"} Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.470900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" event={"ID":"160f98c9-f6b0-4d80-ab97-37487b03a785","Type":"ContainerStarted","Data":"44e20f388432c651ccc007c3ad61ab6bb7dea7e0ce6a6a4c36ea5fb7b5a0d365"} Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.472785 4722 generic.go:334] "Generic (PLEG): container finished" podID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerID="46b48a63d837cf2e8b10b0538a76c4a76f6aa8043992352de619fa26969077d7" exitCode=0 Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.472837 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl585" event={"ID":"463ed397-56f9-4b1b-b32b-4d2204f7632a","Type":"ContainerDied","Data":"46b48a63d837cf2e8b10b0538a76c4a76f6aa8043992352de619fa26969077d7"} Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.474029 4722 generic.go:334] "Generic (PLEG): container finished" podID="d2ad2457-159c-4841-82fe-ea109055fe28" containerID="f9e273f42a8e41743ee6d02bb596c715950a7a5ba9ef749d728a6ff1dbe73928" exitCode=0 Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.474976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2vk2" 
event={"ID":"d2ad2457-159c-4841-82fe-ea109055fe28","Type":"ContainerDied","Data":"f9e273f42a8e41743ee6d02bb596c715950a7a5ba9ef749d728a6ff1dbe73928"} Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.549587 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-h645w" podStartSLOduration=17.54954168 podStartE2EDuration="17.54954168s" podCreationTimestamp="2025-10-02 09:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:35:02.543698078 +0000 UTC m=+158.580451078" watchObservedRunningTime="2025-10-02 09:35:02.54954168 +0000 UTC m=+158.586294660" Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.721237 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.726927 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:02 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:02 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:02 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.727056 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.791979 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:35:02 crc kubenswrapper[4722]: I1002 09:35:02.797071 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jkt7j" Oct 02 09:35:03 crc kubenswrapper[4722]: I1002 09:35:03.274011 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hwbdb" Oct 02 09:35:03 crc kubenswrapper[4722]: I1002 09:35:03.527904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" event={"ID":"160f98c9-f6b0-4d80-ab97-37487b03a785","Type":"ContainerStarted","Data":"11078d8f98e664e7e2db2a9ac1dcbf641e653b5db9e80e09c6980e12d7a41b27"} Oct 02 09:35:03 crc kubenswrapper[4722]: I1002 09:35:03.528002 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:03 crc kubenswrapper[4722]: I1002 09:35:03.533144 4722 generic.go:334] "Generic (PLEG): container finished" podID="74b4c583-6def-4600-8894-261e875642ab" containerID="871665cefa8ec1fbed7c1fa7b1e60a81e6fb5eddece82aed06cec218ff300429" exitCode=0 Oct 02 09:35:03 crc kubenswrapper[4722]: I1002 09:35:03.533225 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"74b4c583-6def-4600-8894-261e875642ab","Type":"ContainerDied","Data":"871665cefa8ec1fbed7c1fa7b1e60a81e6fb5eddece82aed06cec218ff300429"} Oct 02 09:35:03 crc kubenswrapper[4722]: I1002 09:35:03.551545 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" podStartSLOduration=136.551523649 podStartE2EDuration="2m16.551523649s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:35:03.551278633 +0000 UTC m=+159.588031623" watchObservedRunningTime="2025-10-02 09:35:03.551523649 +0000 UTC m=+159.588276629" Oct 02 09:35:03 crc kubenswrapper[4722]: I1002 09:35:03.724205 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:03 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:03 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:03 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:03 crc kubenswrapper[4722]: I1002 09:35:03.724328 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:04 crc kubenswrapper[4722]: I1002 09:35:04.722716 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:04 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:04 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:04 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:04 crc kubenswrapper[4722]: I1002 09:35:04.723354 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:05 crc kubenswrapper[4722]: I1002 09:35:05.721095 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:05 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:05 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:05 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:05 crc kubenswrapper[4722]: I1002 09:35:05.721387 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:06 crc kubenswrapper[4722]: I1002 09:35:06.721571 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:06 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:06 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:06 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:06 crc 
kubenswrapper[4722]: I1002 09:35:06.721646 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:07 crc kubenswrapper[4722]: I1002 09:35:07.645828 4722 patch_prober.go:28] interesting pod/console-f9d7485db-clnnl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 02 09:35:07 crc kubenswrapper[4722]: I1002 09:35:07.645901 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-clnnl" podUID="6898b032-cd5c-42cf-8ebf-df3fec127c5e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 02 09:35:07 crc kubenswrapper[4722]: I1002 09:35:07.653046 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:35:07 crc kubenswrapper[4722]: I1002 09:35:07.653072 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:35:07 crc kubenswrapper[4722]: I1002 09:35:07.653115 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:35:07 crc kubenswrapper[4722]: I1002 09:35:07.653117 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:35:07 crc kubenswrapper[4722]: I1002 09:35:07.722019 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:07 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:07 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:07 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:07 crc kubenswrapper[4722]: I1002 09:35:07.722103 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:08 crc kubenswrapper[4722]: I1002 09:35:08.722533 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:08 crc 
kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:08 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:08 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:08 crc kubenswrapper[4722]: I1002 09:35:08.722614 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.093559 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.273294 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74b4c583-6def-4600-8894-261e875642ab-kubelet-dir\") pod \"74b4c583-6def-4600-8894-261e875642ab\" (UID: \"74b4c583-6def-4600-8894-261e875642ab\") " Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.273376 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74b4c583-6def-4600-8894-261e875642ab-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "74b4c583-6def-4600-8894-261e875642ab" (UID: "74b4c583-6def-4600-8894-261e875642ab"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.273465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b4c583-6def-4600-8894-261e875642ab-kube-api-access\") pod \"74b4c583-6def-4600-8894-261e875642ab\" (UID: \"74b4c583-6def-4600-8894-261e875642ab\") " Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.273718 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74b4c583-6def-4600-8894-261e875642ab-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.281301 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74b4c583-6def-4600-8894-261e875642ab-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "74b4c583-6def-4600-8894-261e875642ab" (UID: "74b4c583-6def-4600-8894-261e875642ab"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.374910 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b4c583-6def-4600-8894-261e875642ab-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.611872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"74b4c583-6def-4600-8894-261e875642ab","Type":"ContainerDied","Data":"3093dafff3b4b7a42cb884f9e8a4ac49f4e26f09b4c9d990b1860d191876e462"} Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.611943 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3093dafff3b4b7a42cb884f9e8a4ac49f4e26f09b4c9d990b1860d191876e462" Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.611908 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.723632 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:09 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:09 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:09 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:09 crc kubenswrapper[4722]: I1002 09:35:09.723706 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:10 crc kubenswrapper[4722]: I1002 09:35:10.289351 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:35:10 crc kubenswrapper[4722]: I1002 09:35:10.293253 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d43b03f-1f3f-4824-9eda-93e405f6ee9e-metrics-certs\") pod \"network-metrics-daemon-jn9jw\" (UID: \"3d43b03f-1f3f-4824-9eda-93e405f6ee9e\") " pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:35:10 crc kubenswrapper[4722]: I1002 09:35:10.524091 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9jw" Oct 02 09:35:10 crc kubenswrapper[4722]: I1002 09:35:10.697384 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:35:10 crc kubenswrapper[4722]: I1002 09:35:10.697799 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:35:10 crc kubenswrapper[4722]: I1002 09:35:10.721958 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:10 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:10 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:10 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:10 crc kubenswrapper[4722]: I1002 09:35:10.722032 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:11 crc kubenswrapper[4722]: I1002 09:35:11.107762 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jn9jw"] Oct 02 09:35:11 crc kubenswrapper[4722]: I1002 09:35:11.722370 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:11 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Oct 02 09:35:11 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:11 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:11 crc kubenswrapper[4722]: I1002 09:35:11.722747 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:12 crc kubenswrapper[4722]: I1002 09:35:12.721258 4722 patch_prober.go:28] interesting pod/router-default-5444994796-6xfhq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 09:35:12 crc kubenswrapper[4722]: [+]has-synced ok Oct 02 09:35:12 crc kubenswrapper[4722]: [+]process-running ok Oct 02 09:35:12 crc kubenswrapper[4722]: healthz check failed Oct 02 09:35:12 crc kubenswrapper[4722]: I1002 09:35:12.721375 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xfhq" podUID="fc04bfb1-a85e-43f5-8f5f-1a4c8fd0fe7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 09:35:13 crc kubenswrapper[4722]: I1002 09:35:13.730081 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:35:13 crc kubenswrapper[4722]: I1002 09:35:13.734900 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6xfhq" Oct 02 09:35:14 crc kubenswrapper[4722]: E1002 09:35:14.989878 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: Get \"https://access.redhat.com/webassets/docker/content/sigstore/redhat/redhat-operator-index@sha256=000f2b2c7af84405b6a20472202d4019498854065ba44568e64bec72afad9ca8/signature-1\": net/http: TLS handshake timeout" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 09:35:14 crc kubenswrapper[4722]: E1002 09:35:14.990452 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5cff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xl585_openshift-marketplace(463ed397-56f9-4b1b-b32b-4d2204f7632a): ErrImagePull: copying system image from manifest list: reading signatures: Get \"https://access.redhat.com/webassets/docker/content/sigstore/redhat/redhat-operator-index@sha256=000f2b2c7af84405b6a20472202d4019498854065ba44568e64bec72afad9ca8/signature-1\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 02 09:35:14 crc kubenswrapper[4722]: E1002 09:35:14.991642 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: Get \\\"https://access.redhat.com/webassets/docker/content/sigstore/redhat/redhat-operator-index@sha256=000f2b2c7af84405b6a20472202d4019498854065ba44568e64bec72afad9ca8/signature-1\\\": net/http: TLS handshake timeout\"" pod="openshift-marketplace/redhat-operators-xl585" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.649972 4722 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.652983 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.653071 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.653117 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.653063 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.653158 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-mfhl4" Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.653735 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"51e90351ef7bd44d08f6a8388f22f9009a2f1812735662d94b322a17fefc008d"} pod="openshift-console/downloads-7954f5f757-mfhl4" containerMessage="Container download-server failed liveness probe, will be restarted" Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.653859 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" containerID="cri-o://51e90351ef7bd44d08f6a8388f22f9009a2f1812735662d94b322a17fefc008d" gracePeriod=2 Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.653815 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.653950 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:35:17 crc kubenswrapper[4722]: I1002 09:35:17.654083 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-clnnl" Oct 02 09:35:18 crc kubenswrapper[4722]: I1002 09:35:18.687544 4722 generic.go:334] "Generic (PLEG): container finished" podID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" 
containerID="51e90351ef7bd44d08f6a8388f22f9009a2f1812735662d94b322a17fefc008d" exitCode=0 Oct 02 09:35:18 crc kubenswrapper[4722]: I1002 09:35:18.687745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mfhl4" event={"ID":"7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3","Type":"ContainerDied","Data":"51e90351ef7bd44d08f6a8388f22f9009a2f1812735662d94b322a17fefc008d"} Oct 02 09:35:21 crc kubenswrapper[4722]: I1002 09:35:21.538144 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" Oct 02 09:35:27 crc kubenswrapper[4722]: I1002 09:35:27.654175 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:35:27 crc kubenswrapper[4722]: I1002 09:35:27.654682 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:35:27 crc kubenswrapper[4722]: I1002 09:35:27.912775 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kbnrw" Oct 02 09:35:30 crc kubenswrapper[4722]: E1002 09:35:30.099552 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 09:35:30 crc kubenswrapper[4722]: E1002 09:35:30.100965 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72slq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-p9hf8_openshift-marketplace(1e7f1ec0-f82a-4911-b120-ba1c25a4a200): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 09:35:30 crc kubenswrapper[4722]: E1002 09:35:30.102358 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-p9hf8" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" Oct 02 09:35:35 crc kubenswrapper[4722]: I1002 09:35:35.786403 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 09:35:36 crc kubenswrapper[4722]: I1002 09:35:36.792277 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" event={"ID":"3d43b03f-1f3f-4824-9eda-93e405f6ee9e","Type":"ContainerStarted","Data":"476ce9932c8933bde18933d8c50e19c7e1cf8192babfee4fe6f783b48ea565f7"} Oct 02 09:35:37 crc kubenswrapper[4722]: I1002 09:35:37.652908 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:35:37 crc kubenswrapper[4722]: I1002 09:35:37.652997 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:35:40 crc kubenswrapper[4722]: I1002 09:35:40.697343 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:35:40 crc kubenswrapper[4722]: I1002 09:35:40.697413 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:35:41 crc kubenswrapper[4722]: E1002 09:35:41.486805 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 09:35:41 crc kubenswrapper[4722]: E1002 09:35:41.487099 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gz5v5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-n5224_openshift-marketplace(d1581831-b357-4fe6-8e8b-5bdbc057f867): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 09:35:41 crc kubenswrapper[4722]: E1002 09:35:41.488396 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-n5224" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" Oct 02 09:35:42 crc kubenswrapper[4722]: E1002 09:35:42.472032 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-n5224" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" Oct 02 09:35:42 crc kubenswrapper[4722]: E1002 09:35:42.567335 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 09:35:42 crc kubenswrapper[4722]: E1002 09:35:42.567530 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j26pq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8zdkg_openshift-marketplace(ea335a73-5af7-4bd2-8439-cfa2304b594e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 09:35:42 crc kubenswrapper[4722]: E1002 09:35:42.568723 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8zdkg" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" Oct 02 09:35:42 crc kubenswrapper[4722]: E1002 09:35:42.724237 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 09:35:42 crc kubenswrapper[4722]: E1002 09:35:42.724423 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cphrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9f2qr_openshift-marketplace(5c352a70-fb33-4e66-a950-0e831abd5d45): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 09:35:42 crc kubenswrapper[4722]: E1002 09:35:42.725647 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9f2qr" podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" Oct 02 09:35:47 crc kubenswrapper[4722]: I1002 09:35:47.652458 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:35:47 crc kubenswrapper[4722]: I1002 09:35:47.652772 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:35:50 crc kubenswrapper[4722]: E1002 09:35:50.220919 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8zdkg" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" Oct 02 09:35:50 crc kubenswrapper[4722]: E1002 09:35:50.221044 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9f2qr" podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" Oct 02 09:35:51 crc kubenswrapper[4722]: E1002 09:35:51.325419 4722 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 09:35:51 crc kubenswrapper[4722]: E1002 09:35:51.325635 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4f9ng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-n2vk2_openshift-marketplace(d2ad2457-159c-4841-82fe-ea109055fe28): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 09:35:51 crc kubenswrapper[4722]: E1002 09:35:51.326821 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-n2vk2" podUID="d2ad2457-159c-4841-82fe-ea109055fe28" Oct 02 09:35:57 crc kubenswrapper[4722]: I1002 09:35:57.653787 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:35:57 crc kubenswrapper[4722]: I1002 09:35:57.654374 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.705628 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 09:36:04 crc 
kubenswrapper[4722]: E1002 09:36:04.706694 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72slq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-p9hf8_openshift-marketplace(1e7f1ec0-f82a-4911-b120-ba1c25a4a200): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.708048 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-p9hf8" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.832566 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.833028 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5cff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xl585_openshift-marketplace(463ed397-56f9-4b1b-b32b-4d2204f7632a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.834099 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xl585" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.847076 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.847282 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.847282 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tg9mx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-x8b9l_openshift-marketplace(a1c11523-129d-450b-bd1f-b72d6a019660): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.849180 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-x8b9l" podUID="a1c11523-129d-450b-bd1f-b72d6a019660"
Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.930277 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.930521 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4kgkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vpgk2_openshift-marketplace(52ea146c-d933-4406-985c-7ec1c9386710): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.931749 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vpgk2" podUID="52ea146c-d933-4406-985c-7ec1c9386710"
Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.944910 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-x8b9l" podUID="a1c11523-129d-450b-bd1f-b72d6a019660"
Oct 02 09:36:04 crc kubenswrapper[4722]: E1002 09:36:04.945366 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-p9hf8" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200"
Oct 02 09:36:05 crc kubenswrapper[4722]: I1002 09:36:05.949631 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" event={"ID":"3d43b03f-1f3f-4824-9eda-93e405f6ee9e","Type":"ContainerStarted","Data":"4ff36088ada0def09d0908f26cd89a63b9f52dbd0e8f8118e45722ceb65d06ab"}
Oct 02 09:36:05 crc kubenswrapper[4722]: I1002 09:36:05.950104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jn9jw" event={"ID":"3d43b03f-1f3f-4824-9eda-93e405f6ee9e","Type":"ContainerStarted","Data":"ce5382f117d7807ca9c98e9d2cb80bd552eae3aa4ba0c34e5f0518e416a2ed89"}
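The &Container{...} dumps above are the kubelet's Go-syntax rendering of the init container spec it failed to start. Purely as a reading aid, roughly the same object built with k8s.io/api/core/v1; a sketch of the fields visible in the dump, not the marketplace operator's actual source.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func int64Ptr(i int64) *int64 { return &i }
func boolPtr(b bool) *bool    { return &b }

// extractContent mirrors the fields visible in the kubelet's dump; zero-valued
// fields (probes, resources, lifecycle) are simply omitted here.
var extractContent = corev1.Container{
	Name:    "extract-content",
	Image:   "registry.redhat.io/redhat/community-operator-index:v4.18",
	Command: []string{"/utilities/copy-content"},
	Args: []string{
		"--catalog.from=/configs", "--catalog.to=/extracted-catalog/catalog",
		"--cache.from=/tmp/cache", "--cache.to=/extracted-catalog/cache",
	},
	VolumeMounts: []corev1.VolumeMount{
		{Name: "utilities", MountPath: "/utilities"},
		{Name: "catalog-content", MountPath: "/extracted-catalog"},
	},
	ImagePullPolicy:          corev1.PullAlways,
	TerminationMessagePolicy: corev1.TerminationMessageFallbackToLogsOnError,
	SecurityContext: &corev1.SecurityContext{
		Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
		RunAsUser:                int64Ptr(1000170000),
		RunAsNonRoot:             boolPtr(true),
		AllowPrivilegeEscalation: boolPtr(false),
	},
}

func main() { fmt.Println(extractContent.Name, extractContent.Image) }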
pod="openshift-console/downloads-7954f5f757-mfhl4" event={"ID":"7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3","Type":"ContainerStarted","Data":"2030861724c5e6d2b9f23d0b73fc6438c1d5122d2d011e8d783427df7ba23c90"} Oct 02 09:36:05 crc kubenswrapper[4722]: I1002 09:36:05.953904 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mfhl4" Oct 02 09:36:05 crc kubenswrapper[4722]: I1002 09:36:05.954464 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 09:36:05 crc kubenswrapper[4722]: I1002 09:36:05.954534 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 02 09:36:05 crc kubenswrapper[4722]: I1002 09:36:05.955632 4722 generic.go:334] "Generic (PLEG): container finished" podID="d2ad2457-159c-4841-82fe-ea109055fe28" containerID="aebd8e60f89edd498d0e2754009812eddc78fe2725b3d8a029cf6714fe068033" exitCode=0 Oct 02 09:36:05 crc kubenswrapper[4722]: I1002 09:36:05.955680 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2vk2" event={"ID":"d2ad2457-159c-4841-82fe-ea109055fe28","Type":"ContainerDied","Data":"aebd8e60f89edd498d0e2754009812eddc78fe2725b3d8a029cf6714fe068033"} Oct 02 09:36:05 crc kubenswrapper[4722]: I1002 09:36:05.976178 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jn9jw" podStartSLOduration=198.976143235 podStartE2EDuration="3m18.976143235s" podCreationTimestamp="2025-10-02 09:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:36:05.971961577 +0000 UTC m=+222.008714577" watchObservedRunningTime="2025-10-02 09:36:05.976143235 +0000 UTC m=+222.012896255" Oct 02 09:36:06 crc kubenswrapper[4722]: I1002 09:36:06.963628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zdkg" event={"ID":"ea335a73-5af7-4bd2-8439-cfa2304b594e","Type":"ContainerStarted","Data":"048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79"} Oct 02 09:36:06 crc kubenswrapper[4722]: I1002 09:36:06.967276 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5224" event={"ID":"d1581831-b357-4fe6-8e8b-5bdbc057f867","Type":"ContainerStarted","Data":"127a3465b19f23eb758277bfe7fa9233486721a9ea274c8815e6e8c218a546cc"} Oct 02 09:36:06 crc kubenswrapper[4722]: I1002 09:36:06.971087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2qr" event={"ID":"5c352a70-fb33-4e66-a950-0e831abd5d45","Type":"ContainerStarted","Data":"a56d6926deffff472a1633688231c11ea6a6dfd36281317320dbcd3c432d1918"} Oct 02 09:36:06 crc kubenswrapper[4722]: I1002 09:36:06.971926 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 02 
Oct 02 09:36:06 crc kubenswrapper[4722]: I1002 09:36:06.971962 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Oct 02 09:36:07 crc kubenswrapper[4722]: I1002 09:36:07.653206 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Oct 02 09:36:07 crc kubenswrapper[4722]: I1002 09:36:07.653616 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Oct 02 09:36:07 crc kubenswrapper[4722]: I1002 09:36:07.653276 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-mfhl4 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Oct 02 09:36:07 crc kubenswrapper[4722]: I1002 09:36:07.653685 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mfhl4" podUID="7fb5f55e-6d4f-465a-9e93-45a36a2ce4b3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Oct 02 09:36:07 crc kubenswrapper[4722]: I1002 09:36:07.977484 4722 generic.go:334] "Generic (PLEG): container finished" podID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerID="048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79" exitCode=0
Oct 02 09:36:07 crc kubenswrapper[4722]: I1002 09:36:07.977605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zdkg" event={"ID":"ea335a73-5af7-4bd2-8439-cfa2304b594e","Type":"ContainerDied","Data":"048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79"}
Oct 02 09:36:07 crc kubenswrapper[4722]: I1002 09:36:07.986101 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerID="127a3465b19f23eb758277bfe7fa9233486721a9ea274c8815e6e8c218a546cc" exitCode=0
Oct 02 09:36:07 crc kubenswrapper[4722]: I1002 09:36:07.986220 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5224" event={"ID":"d1581831-b357-4fe6-8e8b-5bdbc057f867","Type":"ContainerDied","Data":"127a3465b19f23eb758277bfe7fa9233486721a9ea274c8815e6e8c218a546cc"}
Oct 02 09:36:07 crc kubenswrapper[4722]: I1002 09:36:07.991880 4722 generic.go:334] "Generic (PLEG): container finished" podID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerID="a56d6926deffff472a1633688231c11ea6a6dfd36281317320dbcd3c432d1918" exitCode=0
Oct 02 09:36:07 crc kubenswrapper[4722]: I1002 09:36:07.991968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2qr" event={"ID":"5c352a70-fb33-4e66-a950-0e831abd5d45","Type":"ContainerDied","Data":"a56d6926deffff472a1633688231c11ea6a6dfd36281317320dbcd3c432d1918"}
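The repeated "Probe failed ... connection refused" records are the kubelet's HTTP prober polling the pod IP before the server inside the container has started listening; once the server binds the port, the later "readiness ... ready" transition follows. In essence the check is a plain timed HTTP GET where any transport error or non-2xx/3xx status counts as failure; a simplified sketch, not the kubelet's actual prober.go.

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce performs a single HTTP readiness-style check: any transport
// error (e.g. "connection refused") or a status outside 200..399 is a failure.
func probeOnce(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. dial tcp 10.217.0.21:8080: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeOnce("http://10.217.0.21:8080/"); err != nil {
		fmt.Println("Readiness probe failure:", err)
	} else {
		fmt.Println("ready")
	}
}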
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2vk2" event={"ID":"d2ad2457-159c-4841-82fe-ea109055fe28","Type":"ContainerStarted","Data":"c09cf7f0eff32335f09b45a808a3b9080c9ff115555c4027130da6889a0b1c49"} Oct 02 09:36:10 crc kubenswrapper[4722]: I1002 09:36:10.029645 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n2vk2" podStartSLOduration=5.692854046 podStartE2EDuration="1m11.029620827s" podCreationTimestamp="2025-10-02 09:34:59 +0000 UTC" firstStartedPulling="2025-10-02 09:35:02.476103734 +0000 UTC m=+158.512856714" lastFinishedPulling="2025-10-02 09:36:07.812870515 +0000 UTC m=+223.849623495" observedRunningTime="2025-10-02 09:36:10.026449014 +0000 UTC m=+226.063202004" watchObservedRunningTime="2025-10-02 09:36:10.029620827 +0000 UTC m=+226.066373807" Oct 02 09:36:10 crc kubenswrapper[4722]: I1002 09:36:10.696436 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:36:10 crc kubenswrapper[4722]: I1002 09:36:10.696505 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:36:10 crc kubenswrapper[4722]: I1002 09:36:10.696556 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:36:10 crc kubenswrapper[4722]: I1002 09:36:10.697206 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150"} pod="openshift-machine-config-operator/machine-config-daemon-q6h76" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 09:36:10 crc kubenswrapper[4722]: I1002 09:36:10.697266 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" containerID="cri-o://cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150" gracePeriod=600 Oct 02 09:36:11 crc kubenswrapper[4722]: I1002 09:36:11.017957 4722 generic.go:334] "Generic (PLEG): container finished" podID="f662826c-e772-4f56-a85f-83971aaef356" containerID="cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150" exitCode=0 Oct 02 09:36:11 crc kubenswrapper[4722]: I1002 09:36:11.018023 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerDied","Data":"cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150"} Oct 02 09:36:12 crc kubenswrapper[4722]: I1002 09:36:12.027515 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5224" 
event={"ID":"d1581831-b357-4fe6-8e8b-5bdbc057f867","Type":"ContainerStarted","Data":"0d89a5c864fee68e1f6430d686a88a8cb3059a74e559af3fb4423f97c3b79ff8"} Oct 02 09:36:12 crc kubenswrapper[4722]: I1002 09:36:12.048841 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n5224" podStartSLOduration=3.44264496 podStartE2EDuration="1m16.048812531s" podCreationTimestamp="2025-10-02 09:34:56 +0000 UTC" firstStartedPulling="2025-10-02 09:34:58.219487771 +0000 UTC m=+154.256240751" lastFinishedPulling="2025-10-02 09:36:10.825655332 +0000 UTC m=+226.862408322" observedRunningTime="2025-10-02 09:36:12.047598529 +0000 UTC m=+228.084351549" watchObservedRunningTime="2025-10-02 09:36:12.048812531 +0000 UTC m=+228.085565541" Oct 02 09:36:15 crc kubenswrapper[4722]: E1002 09:36:15.830197 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xl585" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" Oct 02 09:36:15 crc kubenswrapper[4722]: E1002 09:36:15.831155 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-p9hf8" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" Oct 02 09:36:16 crc kubenswrapper[4722]: I1002 09:36:16.050597 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerStarted","Data":"312c817917c93b621d9ce37d611efa243a39c268516486ea8c814510b384738e"} Oct 02 09:36:16 crc kubenswrapper[4722]: I1002 09:36:16.828368 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:36:16 crc kubenswrapper[4722]: I1002 09:36:16.829201 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:36:17 crc kubenswrapper[4722]: I1002 09:36:17.059973 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2qr" event={"ID":"5c352a70-fb33-4e66-a950-0e831abd5d45","Type":"ContainerStarted","Data":"130c61d4452e7cb63135189227fd8440bd431100dab795b0ad174be593a586db"} Oct 02 09:36:17 crc kubenswrapper[4722]: I1002 09:36:17.064137 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zdkg" event={"ID":"ea335a73-5af7-4bd2-8439-cfa2304b594e","Type":"ContainerStarted","Data":"ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9"} Oct 02 09:36:17 crc kubenswrapper[4722]: I1002 09:36:17.082938 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9f2qr" podStartSLOduration=3.500631065 podStartE2EDuration="1m21.082914379s" podCreationTimestamp="2025-10-02 09:34:56 +0000 UTC" firstStartedPulling="2025-10-02 09:34:58.248930715 +0000 UTC m=+154.285683695" lastFinishedPulling="2025-10-02 09:36:15.831214029 +0000 UTC m=+231.867967009" observedRunningTime="2025-10-02 09:36:17.079356106 +0000 UTC m=+233.116109106" watchObservedRunningTime="2025-10-02 09:36:17.082914379 +0000 UTC 
m=+233.119667359" Oct 02 09:36:17 crc kubenswrapper[4722]: I1002 09:36:17.104840 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:36:17 crc kubenswrapper[4722]: I1002 09:36:17.108366 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8zdkg" podStartSLOduration=4.424326764 podStartE2EDuration="1m21.108342728s" podCreationTimestamp="2025-10-02 09:34:56 +0000 UTC" firstStartedPulling="2025-10-02 09:34:58.21482189 +0000 UTC m=+154.251574870" lastFinishedPulling="2025-10-02 09:36:14.898837854 +0000 UTC m=+230.935590834" observedRunningTime="2025-10-02 09:36:17.103616836 +0000 UTC m=+233.140369836" watchObservedRunningTime="2025-10-02 09:36:17.108342728 +0000 UTC m=+233.145095708" Oct 02 09:36:17 crc kubenswrapper[4722]: I1002 09:36:17.160358 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:36:17 crc kubenswrapper[4722]: I1002 09:36:17.660379 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mfhl4" Oct 02 09:36:19 crc kubenswrapper[4722]: I1002 09:36:19.141842 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n5224"] Oct 02 09:36:19 crc kubenswrapper[4722]: I1002 09:36:19.142104 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n5224" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerName="registry-server" containerID="cri-o://0d89a5c864fee68e1f6430d686a88a8cb3059a74e559af3fb4423f97c3b79ff8" gracePeriod=2 Oct 02 09:36:19 crc kubenswrapper[4722]: I1002 09:36:19.408490 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:36:19 crc kubenswrapper[4722]: I1002 09:36:19.408542 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:36:19 crc kubenswrapper[4722]: I1002 09:36:19.448708 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.084249 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerID="0d89a5c864fee68e1f6430d686a88a8cb3059a74e559af3fb4423f97c3b79ff8" exitCode=0 Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.084336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5224" event={"ID":"d1581831-b357-4fe6-8e8b-5bdbc057f867","Type":"ContainerDied","Data":"0d89a5c864fee68e1f6430d686a88a8cb3059a74e559af3fb4423f97c3b79ff8"} Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.129655 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.334796 4722 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.334796 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n5224"
Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.426361 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz5v5\" (UniqueName: \"kubernetes.io/projected/d1581831-b357-4fe6-8e8b-5bdbc057f867-kube-api-access-gz5v5\") pod \"d1581831-b357-4fe6-8e8b-5bdbc057f867\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") "
Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.426429 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-utilities\") pod \"d1581831-b357-4fe6-8e8b-5bdbc057f867\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") "
Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.426495 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-catalog-content\") pod \"d1581831-b357-4fe6-8e8b-5bdbc057f867\" (UID: \"d1581831-b357-4fe6-8e8b-5bdbc057f867\") "
Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.427629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-utilities" (OuterVolumeSpecName: "utilities") pod "d1581831-b357-4fe6-8e8b-5bdbc057f867" (UID: "d1581831-b357-4fe6-8e8b-5bdbc057f867"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.433807 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1581831-b357-4fe6-8e8b-5bdbc057f867-kube-api-access-gz5v5" (OuterVolumeSpecName: "kube-api-access-gz5v5") pod "d1581831-b357-4fe6-8e8b-5bdbc057f867" (UID: "d1581831-b357-4fe6-8e8b-5bdbc057f867"). InnerVolumeSpecName "kube-api-access-gz5v5". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.527803 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz5v5\" (UniqueName: \"kubernetes.io/projected/d1581831-b357-4fe6-8e8b-5bdbc057f867-kube-api-access-gz5v5\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.527845 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:20 crc kubenswrapper[4722]: I1002 09:36:20.527858 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1581831-b357-4fe6-8e8b-5bdbc057f867-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:21 crc kubenswrapper[4722]: I1002 09:36:21.094422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5224" event={"ID":"d1581831-b357-4fe6-8e8b-5bdbc057f867","Type":"ContainerDied","Data":"30f3395a5b5bffa19d8d4592594ab592221b91e1bc87ba000560e4f3555c48d1"} Oct 02 09:36:21 crc kubenswrapper[4722]: I1002 09:36:21.094497 4722 scope.go:117] "RemoveContainer" containerID="0d89a5c864fee68e1f6430d686a88a8cb3059a74e559af3fb4423f97c3b79ff8" Oct 02 09:36:21 crc kubenswrapper[4722]: I1002 09:36:21.094553 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n5224" Oct 02 09:36:21 crc kubenswrapper[4722]: I1002 09:36:21.117285 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n5224"] Oct 02 09:36:21 crc kubenswrapper[4722]: I1002 09:36:21.129684 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n5224"] Oct 02 09:36:22 crc kubenswrapper[4722]: I1002 09:36:22.717237 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" path="/var/lib/kubelet/pods/d1581831-b357-4fe6-8e8b-5bdbc057f867/volumes" Oct 02 09:36:26 crc kubenswrapper[4722]: I1002 09:36:26.383933 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:36:26 crc kubenswrapper[4722]: I1002 09:36:26.384800 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:36:26 crc kubenswrapper[4722]: I1002 09:36:26.435366 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:36:26 crc kubenswrapper[4722]: I1002 09:36:26.870541 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:36:26 crc kubenswrapper[4722]: I1002 09:36:26.870850 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:36:26 crc kubenswrapper[4722]: I1002 09:36:26.916147 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:36:27 crc kubenswrapper[4722]: I1002 09:36:27.167659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:36:27 crc kubenswrapper[4722]: I1002 09:36:27.189657 
4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8zdkg" Oct 02 09:36:27 crc kubenswrapper[4722]: I1002 09:36:27.573646 4722 scope.go:117] "RemoveContainer" containerID="127a3465b19f23eb758277bfe7fa9233486721a9ea274c8815e6e8c218a546cc" Oct 02 09:36:27 crc kubenswrapper[4722]: I1002 09:36:27.745984 4722 scope.go:117] "RemoveContainer" containerID="bae21b5465783ca8558d73e5ad1491c543547988abfa477b2be37e8222c45e2c" Oct 02 09:36:28 crc kubenswrapper[4722]: I1002 09:36:28.136291 4722 generic.go:334] "Generic (PLEG): container finished" podID="a1c11523-129d-450b-bd1f-b72d6a019660" containerID="73febdb5045fdc3021b0236dd2fed29399bd0fbeffdefe8c31c73e728f4ee2ce" exitCode=0 Oct 02 09:36:28 crc kubenswrapper[4722]: I1002 09:36:28.137048 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8b9l" event={"ID":"a1c11523-129d-450b-bd1f-b72d6a019660","Type":"ContainerDied","Data":"73febdb5045fdc3021b0236dd2fed29399bd0fbeffdefe8c31c73e728f4ee2ce"} Oct 02 09:36:28 crc kubenswrapper[4722]: I1002 09:36:28.141452 4722 generic.go:334] "Generic (PLEG): container finished" podID="52ea146c-d933-4406-985c-7ec1c9386710" containerID="2503513ae58c1b52627c05b1f42c59569a3732e552438906acb868a959b269ed" exitCode=0 Oct 02 09:36:28 crc kubenswrapper[4722]: I1002 09:36:28.142151 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpgk2" event={"ID":"52ea146c-d933-4406-985c-7ec1c9386710","Type":"ContainerDied","Data":"2503513ae58c1b52627c05b1f42c59569a3732e552438906acb868a959b269ed"} Oct 02 09:36:29 crc kubenswrapper[4722]: I1002 09:36:29.148827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpgk2" event={"ID":"52ea146c-d933-4406-985c-7ec1c9386710","Type":"ContainerStarted","Data":"ea1b574e997b322a13e0caff4e95eea6a88eb8fac5cdb1cc0bcbea1d4710d054"} Oct 02 09:36:29 crc kubenswrapper[4722]: I1002 09:36:29.153727 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl585" event={"ID":"463ed397-56f9-4b1b-b32b-4d2204f7632a","Type":"ContainerStarted","Data":"535b0b1e0c8955415776fe3b5f95596c255019f934e344a06ca276193fa278bb"} Oct 02 09:36:29 crc kubenswrapper[4722]: I1002 09:36:29.157203 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8b9l" event={"ID":"a1c11523-129d-450b-bd1f-b72d6a019660","Type":"ContainerStarted","Data":"6cd65f97545e9af53c7100c933165842f6c8a81cd57fc93f761084d0aceb1617"} Oct 02 09:36:29 crc kubenswrapper[4722]: I1002 09:36:29.174222 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vpgk2" podStartSLOduration=4.759111888 podStartE2EDuration="1m31.174200808s" podCreationTimestamp="2025-10-02 09:34:58 +0000 UTC" firstStartedPulling="2025-10-02 09:35:02.476370811 +0000 UTC m=+158.513123791" lastFinishedPulling="2025-10-02 09:36:28.891459741 +0000 UTC m=+244.928212711" observedRunningTime="2025-10-02 09:36:29.169126296 +0000 UTC m=+245.205879286" watchObservedRunningTime="2025-10-02 09:36:29.174200808 +0000 UTC m=+245.210953788" Oct 02 09:36:29 crc kubenswrapper[4722]: I1002 09:36:29.189795 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x8b9l" podStartSLOduration=2.864383323 podStartE2EDuration="1m31.189756712s" podCreationTimestamp="2025-10-02 09:34:58 +0000 UTC" 
firstStartedPulling="2025-10-02 09:35:00.370516517 +0000 UTC m=+156.407269497" lastFinishedPulling="2025-10-02 09:36:28.695889906 +0000 UTC m=+244.732642886" observedRunningTime="2025-10-02 09:36:29.187484913 +0000 UTC m=+245.224237913" watchObservedRunningTime="2025-10-02 09:36:29.189756712 +0000 UTC m=+245.226509682" Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.084307 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8zdkg"] Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.165743 4722 generic.go:334] "Generic (PLEG): container finished" podID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerID="535b0b1e0c8955415776fe3b5f95596c255019f934e344a06ca276193fa278bb" exitCode=0 Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.165838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl585" event={"ID":"463ed397-56f9-4b1b-b32b-4d2204f7632a","Type":"ContainerDied","Data":"535b0b1e0c8955415776fe3b5f95596c255019f934e344a06ca276193fa278bb"} Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.168567 4722 generic.go:334] "Generic (PLEG): container finished" podID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerID="0485a102bafff7273eb0d38e8d409f6e271c5eeefa8eb4224921d464b0c19860" exitCode=0 Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.168677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9hf8" event={"ID":"1e7f1ec0-f82a-4911-b120-ba1c25a4a200","Type":"ContainerDied","Data":"0485a102bafff7273eb0d38e8d409f6e271c5eeefa8eb4224921d464b0c19860"} Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.168811 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8zdkg" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerName="registry-server" containerID="cri-o://ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9" gracePeriod=2 Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.544348 4722 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.544348 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8zdkg"
Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.684949 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-catalog-content\") pod \"ea335a73-5af7-4bd2-8439-cfa2304b594e\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") "
Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.685048 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j26pq\" (UniqueName: \"kubernetes.io/projected/ea335a73-5af7-4bd2-8439-cfa2304b594e-kube-api-access-j26pq\") pod \"ea335a73-5af7-4bd2-8439-cfa2304b594e\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") "
Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.685178 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-utilities\") pod \"ea335a73-5af7-4bd2-8439-cfa2304b594e\" (UID: \"ea335a73-5af7-4bd2-8439-cfa2304b594e\") "
Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.686159 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-utilities" (OuterVolumeSpecName: "utilities") pod "ea335a73-5af7-4bd2-8439-cfa2304b594e" (UID: "ea335a73-5af7-4bd2-8439-cfa2304b594e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.691526 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea335a73-5af7-4bd2-8439-cfa2304b594e-kube-api-access-j26pq" (OuterVolumeSpecName: "kube-api-access-j26pq") pod "ea335a73-5af7-4bd2-8439-cfa2304b594e" (UID: "ea335a73-5af7-4bd2-8439-cfa2304b594e"). InnerVolumeSpecName "kube-api-access-j26pq". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.787031 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.787079 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j26pq\" (UniqueName: \"kubernetes.io/projected/ea335a73-5af7-4bd2-8439-cfa2304b594e-kube-api-access-j26pq\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:30 crc kubenswrapper[4722]: I1002 09:36:30.787093 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea335a73-5af7-4bd2-8439-cfa2304b594e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.179860 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl585" event={"ID":"463ed397-56f9-4b1b-b32b-4d2204f7632a","Type":"ContainerStarted","Data":"6da6ea9f01b78b85f3f4fd838c10ce01336f16d170590c1e3804666b832efe58"} Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.181739 4722 generic.go:334] "Generic (PLEG): container finished" podID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerID="ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9" exitCode=0 Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.181793 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zdkg" event={"ID":"ea335a73-5af7-4bd2-8439-cfa2304b594e","Type":"ContainerDied","Data":"ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9"} Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.181814 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zdkg" event={"ID":"ea335a73-5af7-4bd2-8439-cfa2304b594e","Type":"ContainerDied","Data":"465382d855915037021a53e4c830224d30a576d92c40686fee617f4873c4c5cf"} Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.181838 4722 scope.go:117] "RemoveContainer" containerID="ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9" Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.181873 4722 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.181873 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8zdkg"
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.189643 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9hf8" event={"ID":"1e7f1ec0-f82a-4911-b120-ba1c25a4a200","Type":"ContainerStarted","Data":"da5f92db22136c6da85e2e7bf530e950bc0cdafdab8b82f0df50cf88c7e72212"}
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.196471 4722 scope.go:117] "RemoveContainer" containerID="048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79"
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.209081 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xl585" podStartSLOduration=5.129589569 podStartE2EDuration="1m32.20905611s" podCreationTimestamp="2025-10-02 09:34:59 +0000 UTC" firstStartedPulling="2025-10-02 09:35:03.535043981 +0000 UTC m=+159.571796961" lastFinishedPulling="2025-10-02 09:36:30.614510522 +0000 UTC m=+246.651263502" observedRunningTime="2025-10-02 09:36:31.204886122 +0000 UTC m=+247.241639102" watchObservedRunningTime="2025-10-02 09:36:31.20905611 +0000 UTC m=+247.245809100"
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.230201 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8zdkg"]
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.233089 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8zdkg"]
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.247438 4722 scope.go:117] "RemoveContainer" containerID="f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55"
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.252700 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p9hf8" podStartSLOduration=3.681098269 podStartE2EDuration="1m36.252672112s" podCreationTimestamp="2025-10-02 09:34:55 +0000 UTC" firstStartedPulling="2025-10-02 09:34:58.223467214 +0000 UTC m=+154.260220184" lastFinishedPulling="2025-10-02 09:36:30.795041047 +0000 UTC m=+246.831794027" observedRunningTime="2025-10-02 09:36:31.248099753 +0000 UTC m=+247.284852753" watchObservedRunningTime="2025-10-02 09:36:31.252672112 +0000 UTC m=+247.289425092"
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.262727 4722 scope.go:117] "RemoveContainer" containerID="ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9"
Oct 02 09:36:31 crc kubenswrapper[4722]: E1002 09:36:31.263210 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9\": container with ID starting with ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9 not found: ID does not exist" containerID="ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9"
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.263265 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9"} err="failed to get container status \"ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9\": rpc error: code = NotFound desc = could not find container \"ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9\": container with ID starting with ddc867f020c3b2e2572582c66e91c6fe43b09239cb1802495704d5e6d82c08c9 not found: ID does not exist"
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.263307 4722 scope.go:117] "RemoveContainer" containerID="048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79"
Oct 02 09:36:31 crc kubenswrapper[4722]: E1002 09:36:31.263657 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79\": container with ID starting with 048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79 not found: ID does not exist" containerID="048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79"
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.263699 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79"} err="failed to get container status \"048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79\": rpc error: code = NotFound desc = could not find container \"048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79\": container with ID starting with 048acbc67111768e95c693255cf3f737daf9bd8bcf773d4f30d77e8ae7c5ad79 not found: ID does not exist"
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.263726 4722 scope.go:117] "RemoveContainer" containerID="f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55"
Oct 02 09:36:31 crc kubenswrapper[4722]: E1002 09:36:31.264116 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55\": container with ID starting with f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55 not found: ID does not exist" containerID="f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55"
Oct 02 09:36:31 crc kubenswrapper[4722]: I1002 09:36:31.264136 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55"} err="failed to get container status \"f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55\": rpc error: code = NotFound desc = could not find container \"f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55\": container with ID starting with f906d18d2c1bdedaa8e9e988a10ecd0dc857be4917160e961e04c83fc9b01e55 not found: ID does not exist"
Oct 02 09:36:32 crc kubenswrapper[4722]: I1002 09:36:32.727456 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" path="/var/lib/kubelet/pods/ea335a73-5af7-4bd2-8439-cfa2304b594e/volumes"
Oct 02 09:36:36 crc kubenswrapper[4722]: I1002 09:36:36.260821 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p9hf8"
Oct 02 09:36:36 crc kubenswrapper[4722]: I1002 09:36:36.261193 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p9hf8"
Oct 02 09:36:36 crc kubenswrapper[4722]: I1002 09:36:36.332366 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p9hf8"
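The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triples above are benign: the container was already gone, and the kubelet treats deletion as idempotent. With a gRPC-backed runtime the usual pattern is to swallow codes.NotFound; a sketch of that pattern (removeContainer here is a stand-in, not the real CRI client API).

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer call whose target
// is already gone; the runtime answers NotFound, as in the log above.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound,
		"could not find container %q: ID does not exist", id)
}

// removeIfPresent treats NotFound as success: the desired state
// ("container absent") already holds, so deletion is idempotent.
func removeIfPresent(id string) error {
	err := removeContainer(id)
	if status.Code(err) == codes.NotFound {
		return nil // already gone
	}
	return err
}

func main() {
	fmt.Println("err =", removeIfPresent("ddc867f020c3"))
}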
pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:36:38 crc kubenswrapper[4722]: I1002 09:36:38.370877 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:36:38 crc kubenswrapper[4722]: I1002 09:36:38.372517 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:36:38 crc kubenswrapper[4722]: I1002 09:36:38.424768 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:36:38 crc kubenswrapper[4722]: I1002 09:36:38.828753 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:36:38 crc kubenswrapper[4722]: I1002 09:36:38.828819 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:36:38 crc kubenswrapper[4722]: I1002 09:36:38.866867 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:36:39 crc kubenswrapper[4722]: I1002 09:36:39.278638 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:36:39 crc kubenswrapper[4722]: I1002 09:36:39.281339 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:36:39 crc kubenswrapper[4722]: I1002 09:36:39.678506 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpgk2"] Oct 02 09:36:39 crc kubenswrapper[4722]: I1002 09:36:39.787380 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:36:39 crc kubenswrapper[4722]: I1002 09:36:39.787424 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:36:39 crc kubenswrapper[4722]: I1002 09:36:39.825510 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:36:40 crc kubenswrapper[4722]: I1002 09:36:40.281127 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:36:41 crc kubenswrapper[4722]: I1002 09:36:41.243609 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vpgk2" podUID="52ea146c-d933-4406-985c-7ec1c9386710" containerName="registry-server" containerID="cri-o://ea1b574e997b322a13e0caff4e95eea6a88eb8fac5cdb1cc0bcbea1d4710d054" gracePeriod=2 Oct 02 09:36:42 crc kubenswrapper[4722]: I1002 09:36:42.083481 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xl585"] Oct 02 09:36:42 crc kubenswrapper[4722]: I1002 09:36:42.266633 4722 generic.go:334] "Generic (PLEG): container finished" podID="52ea146c-d933-4406-985c-7ec1c9386710" containerID="ea1b574e997b322a13e0caff4e95eea6a88eb8fac5cdb1cc0bcbea1d4710d054" exitCode=0 Oct 02 09:36:42 crc kubenswrapper[4722]: I1002 09:36:42.266909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpgk2" 
event={"ID":"52ea146c-d933-4406-985c-7ec1c9386710","Type":"ContainerDied","Data":"ea1b574e997b322a13e0caff4e95eea6a88eb8fac5cdb1cc0bcbea1d4710d054"} Oct 02 09:36:42 crc kubenswrapper[4722]: I1002 09:36:42.267380 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xl585" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerName="registry-server" containerID="cri-o://6da6ea9f01b78b85f3f4fd838c10ce01336f16d170590c1e3804666b832efe58" gracePeriod=2 Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.124458 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.276102 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-utilities\") pod \"52ea146c-d933-4406-985c-7ec1c9386710\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.276337 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-catalog-content\") pod \"52ea146c-d933-4406-985c-7ec1c9386710\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.276485 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kgkb\" (UniqueName: \"kubernetes.io/projected/52ea146c-d933-4406-985c-7ec1c9386710-kube-api-access-4kgkb\") pod \"52ea146c-d933-4406-985c-7ec1c9386710\" (UID: \"52ea146c-d933-4406-985c-7ec1c9386710\") " Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.277833 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-utilities" (OuterVolumeSpecName: "utilities") pod "52ea146c-d933-4406-985c-7ec1c9386710" (UID: "52ea146c-d933-4406-985c-7ec1c9386710"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.280302 4722 generic.go:334] "Generic (PLEG): container finished" podID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerID="6da6ea9f01b78b85f3f4fd838c10ce01336f16d170590c1e3804666b832efe58" exitCode=0 Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.280388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl585" event={"ID":"463ed397-56f9-4b1b-b32b-4d2204f7632a","Type":"ContainerDied","Data":"6da6ea9f01b78b85f3f4fd838c10ce01336f16d170590c1e3804666b832efe58"} Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.283457 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ea146c-d933-4406-985c-7ec1c9386710-kube-api-access-4kgkb" (OuterVolumeSpecName: "kube-api-access-4kgkb") pod "52ea146c-d933-4406-985c-7ec1c9386710" (UID: "52ea146c-d933-4406-985c-7ec1c9386710"). InnerVolumeSpecName "kube-api-access-4kgkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.284548 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpgk2" event={"ID":"52ea146c-d933-4406-985c-7ec1c9386710","Type":"ContainerDied","Data":"b764963c98d4108da8b9aeb11bb405ec97f136abfc3a1d8d9751097e7fd9e2ae"} Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.284603 4722 scope.go:117] "RemoveContainer" containerID="ea1b574e997b322a13e0caff4e95eea6a88eb8fac5cdb1cc0bcbea1d4710d054" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.284747 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpgk2" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.290567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52ea146c-d933-4406-985c-7ec1c9386710" (UID: "52ea146c-d933-4406-985c-7ec1c9386710"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.307806 4722 scope.go:117] "RemoveContainer" containerID="2503513ae58c1b52627c05b1f42c59569a3732e552438906acb868a959b269ed" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.323301 4722 scope.go:117] "RemoveContainer" containerID="c9ad4e02298759548a4bb4d2836d9a1b8b01b3b23e5609f3d28ce5162b049a62" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.348335 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.382152 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kgkb\" (UniqueName: \"kubernetes.io/projected/52ea146c-d933-4406-985c-7ec1c9386710-kube-api-access-4kgkb\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.382197 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.382207 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52ea146c-d933-4406-985c-7ec1c9386710-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.483036 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-catalog-content\") pod \"463ed397-56f9-4b1b-b32b-4d2204f7632a\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.483202 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5cff\" (UniqueName: \"kubernetes.io/projected/463ed397-56f9-4b1b-b32b-4d2204f7632a-kube-api-access-h5cff\") pod \"463ed397-56f9-4b1b-b32b-4d2204f7632a\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.483235 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-utilities\") pod 
\"463ed397-56f9-4b1b-b32b-4d2204f7632a\" (UID: \"463ed397-56f9-4b1b-b32b-4d2204f7632a\") " Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.484199 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-utilities" (OuterVolumeSpecName: "utilities") pod "463ed397-56f9-4b1b-b32b-4d2204f7632a" (UID: "463ed397-56f9-4b1b-b32b-4d2204f7632a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.487276 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463ed397-56f9-4b1b-b32b-4d2204f7632a-kube-api-access-h5cff" (OuterVolumeSpecName: "kube-api-access-h5cff") pod "463ed397-56f9-4b1b-b32b-4d2204f7632a" (UID: "463ed397-56f9-4b1b-b32b-4d2204f7632a"). InnerVolumeSpecName "kube-api-access-h5cff". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.584996 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5cff\" (UniqueName: \"kubernetes.io/projected/463ed397-56f9-4b1b-b32b-4d2204f7632a-kube-api-access-h5cff\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.585080 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.625631 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpgk2"] Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.626208 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpgk2"] Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.767424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "463ed397-56f9-4b1b-b32b-4d2204f7632a" (UID: "463ed397-56f9-4b1b-b32b-4d2204f7632a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:36:43 crc kubenswrapper[4722]: I1002 09:36:43.788634 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463ed397-56f9-4b1b-b32b-4d2204f7632a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:36:44 crc kubenswrapper[4722]: I1002 09:36:44.294140 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl585" event={"ID":"463ed397-56f9-4b1b-b32b-4d2204f7632a","Type":"ContainerDied","Data":"f82b764fba18e946edde77a4f7f9e4c7790f2dcf078b7649dd6c90393bad9cd9"} Oct 02 09:36:44 crc kubenswrapper[4722]: I1002 09:36:44.294203 4722 scope.go:117] "RemoveContainer" containerID="6da6ea9f01b78b85f3f4fd838c10ce01336f16d170590c1e3804666b832efe58" Oct 02 09:36:44 crc kubenswrapper[4722]: I1002 09:36:44.294268 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xl585" Oct 02 09:36:44 crc kubenswrapper[4722]: I1002 09:36:44.314137 4722 scope.go:117] "RemoveContainer" containerID="535b0b1e0c8955415776fe3b5f95596c255019f934e344a06ca276193fa278bb" Oct 02 09:36:44 crc kubenswrapper[4722]: I1002 09:36:44.333795 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xl585"] Oct 02 09:36:44 crc kubenswrapper[4722]: I1002 09:36:44.336465 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xl585"] Oct 02 09:36:44 crc kubenswrapper[4722]: I1002 09:36:44.347132 4722 scope.go:117] "RemoveContainer" containerID="46b48a63d837cf2e8b10b0538a76c4a76f6aa8043992352de619fa26969077d7" Oct 02 09:36:44 crc kubenswrapper[4722]: I1002 09:36:44.717286 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" path="/var/lib/kubelet/pods/463ed397-56f9-4b1b-b32b-4d2204f7632a/volumes" Oct 02 09:36:44 crc kubenswrapper[4722]: I1002 09:36:44.718498 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52ea146c-d933-4406-985c-7ec1c9386710" path="/var/lib/kubelet/pods/52ea146c-d933-4406-985c-7ec1c9386710/volumes" Oct 02 09:37:18 crc kubenswrapper[4722]: I1002 09:37:18.519907 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ljxl7"] Oct 02 09:37:43 crc kubenswrapper[4722]: I1002 09:37:43.572867 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" podUID="db4cb93f-b00a-42b1-b5d2-704e6c6010c6" containerName="oauth-openshift" containerID="cri-o://079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba" gracePeriod=15 Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.081617 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.125557 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2"] Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.125906 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30e5a09-5492-426b-89a9-2331cce3cc87" containerName="collect-profiles" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.125937 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30e5a09-5492-426b-89a9-2331cce3cc87" containerName="collect-profiles" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.125962 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ea146c-d933-4406-985c-7ec1c9386710" containerName="extract-content" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.125974 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ea146c-d933-4406-985c-7ec1c9386710" containerName="extract-content" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.125992 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerName="extract-utilities" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126002 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerName="extract-utilities" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126019 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126032 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126051 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerName="extract-content" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126063 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerName="extract-content" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126077 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ea146c-d933-4406-985c-7ec1c9386710" containerName="extract-utilities" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126086 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ea146c-d933-4406-985c-7ec1c9386710" containerName="extract-utilities" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126099 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerName="extract-content" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126109 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerName="extract-content" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126125 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerName="extract-utilities" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126134 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerName="extract-utilities" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126151 4722 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126162 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126177 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b4c583-6def-4600-8894-261e875642ab" containerName="pruner" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126188 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b4c583-6def-4600-8894-261e875642ab" containerName="pruner" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126201 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4cb93f-b00a-42b1-b5d2-704e6c6010c6" containerName="oauth-openshift" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126214 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4cb93f-b00a-42b1-b5d2-704e6c6010c6" containerName="oauth-openshift" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126228 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerName="extract-utilities" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126239 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerName="extract-utilities" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126255 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126268 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126281 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ea146c-d933-4406-985c-7ec1c9386710" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126292 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ea146c-d933-4406-985c-7ec1c9386710" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126345 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc19c16-fd8f-4ec7-9066-b18aff12d731" containerName="pruner" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126359 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc19c16-fd8f-4ec7-9066-b18aff12d731" containerName="pruner" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.126375 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerName="extract-content" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126388 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerName="extract-content" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126564 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="74b4c583-6def-4600-8894-261e875642ab" containerName="pruner" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126586 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="52ea146c-d933-4406-985c-7ec1c9386710" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126603 4722 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d1581831-b357-4fe6-8e8b-5bdbc057f867" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126617 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="463ed397-56f9-4b1b-b32b-4d2204f7632a" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126630 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4cb93f-b00a-42b1-b5d2-704e6c6010c6" containerName="oauth-openshift" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126642 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc19c16-fd8f-4ec7-9066-b18aff12d731" containerName="pruner" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126655 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea335a73-5af7-4bd2-8439-cfa2304b594e" containerName="registry-server" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.126672 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30e5a09-5492-426b-89a9-2331cce3cc87" containerName="collect-profiles" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.127405 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.146046 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2"] Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.226796 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-session\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.226861 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-service-ca\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.227257 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-dir\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.227418 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-router-certs\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.227570 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-serving-cert\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.227700 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-idp-0-file-data\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.227796 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjqnv\" (UniqueName: \"kubernetes.io/projected/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-kube-api-access-qjqnv\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.227898 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-ocp-branding-template\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.228005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-policies\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.227350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.228344 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-provider-selection\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.227810 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.228449 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-login\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.228489 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-trusted-ca-bundle\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.228529 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-cliconfig\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.228558 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-error\") pod \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\" (UID: \"db4cb93f-b00a-42b1-b5d2-704e6c6010c6\") " Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.228691 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.228483 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.228722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.228995 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229119 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-template-login\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229170 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-audit-dir\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229203 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-session\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-template-error\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229305 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229357 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229474 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229644 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rprs\" (UniqueName: \"kubernetes.io/projected/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-kube-api-access-9rprs\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229704 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-audit-policies\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229771 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229801 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229884 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 
02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229905 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229920 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.229935 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.230290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.234680 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.235231 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-kube-api-access-qjqnv" (OuterVolumeSpecName: "kube-api-access-qjqnv") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "kube-api-access-qjqnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.235388 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.235605 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.235940 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.236074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.236202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.236380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.238959 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "db4cb93f-b00a-42b1-b5d2-704e6c6010c6" (UID: "db4cb93f-b00a-42b1-b5d2-704e6c6010c6"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331209 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331290 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331341 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-template-login\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331371 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-audit-dir\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331392 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-session\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331419 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-template-error\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331449 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " 
pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331476 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331505 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rprs\" (UniqueName: \"kubernetes.io/projected/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-kube-api-access-9rprs\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-audit-policies\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331585 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331610 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331690 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331708 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331720 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331733 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjqnv\" (UniqueName: \"kubernetes.io/projected/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-kube-api-access-qjqnv\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331745 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331765 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331777 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331789 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331801 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.331816 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db4cb93f-b00a-42b1-b5d2-704e6c6010c6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.332169 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-audit-dir\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.332295 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.332665 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc 
kubenswrapper[4722]: I1002 09:37:44.333221 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.333395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-audit-policies\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.336714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-session\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.336786 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.336852 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.337007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.337626 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.337752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-template-login\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.337953 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-user-template-error\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.338761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.353405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rprs\" (UniqueName: \"kubernetes.io/projected/ec232244-6702-4a68-bbb4-3d9fa7a9a2eb-kube-api-access-9rprs\") pod \"oauth-openshift-5f96bbd69c-j7nn2\" (UID: \"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.448962 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.623695 4722 generic.go:334] "Generic (PLEG): container finished" podID="db4cb93f-b00a-42b1-b5d2-704e6c6010c6" containerID="079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba" exitCode=0 Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.623785 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.623771 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" event={"ID":"db4cb93f-b00a-42b1-b5d2-704e6c6010c6","Type":"ContainerDied","Data":"079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba"} Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.623874 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ljxl7" event={"ID":"db4cb93f-b00a-42b1-b5d2-704e6c6010c6","Type":"ContainerDied","Data":"5e3092471504a86095de8d2e8b496d377696a22976e94ff192c7b421077a1a3d"} Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.623907 4722 scope.go:117] "RemoveContainer" containerID="079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.636246 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2"] Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.658256 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ljxl7"] Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.664014 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ljxl7"] Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.665554 4722 scope.go:117] "RemoveContainer" containerID="079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba" Oct 02 09:37:44 crc kubenswrapper[4722]: E1002 09:37:44.666695 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba\": container with ID starting with 079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba not found: ID does not exist" containerID="079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.666754 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba"} err="failed to get container status \"079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba\": rpc error: code = NotFound desc = could not find container \"079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba\": container with ID starting with 079886b127f56364cb4f4a2af4404f3c7d2b812c2f7a3bde1b68aee6707705ba not found: ID does not exist" Oct 02 09:37:44 crc kubenswrapper[4722]: I1002 09:37:44.717170 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db4cb93f-b00a-42b1-b5d2-704e6c6010c6" path="/var/lib/kubelet/pods/db4cb93f-b00a-42b1-b5d2-704e6c6010c6/volumes" Oct 02 09:37:45 crc kubenswrapper[4722]: I1002 09:37:45.630908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" event={"ID":"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb","Type":"ContainerStarted","Data":"18c686c6d7e8cb514b992ae003ed1a76fd2e43693f5add229baaac5abc20c356"} Oct 02 09:37:45 crc kubenswrapper[4722]: I1002 09:37:45.631407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" 
event={"ID":"ec232244-6702-4a68-bbb4-3d9fa7a9a2eb","Type":"ContainerStarted","Data":"8ac2833aba19ade516a8de1ae02ce677f4e763ac776bae829f5dc85d2ac53b7c"} Oct 02 09:37:45 crc kubenswrapper[4722]: I1002 09:37:45.631453 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:45 crc kubenswrapper[4722]: I1002 09:37:45.636501 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" Oct 02 09:37:45 crc kubenswrapper[4722]: I1002 09:37:45.673484 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5f96bbd69c-j7nn2" podStartSLOduration=27.673464033 podStartE2EDuration="27.673464033s" podCreationTimestamp="2025-10-02 09:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:37:45.652538754 +0000 UTC m=+321.689291734" watchObservedRunningTime="2025-10-02 09:37:45.673464033 +0000 UTC m=+321.710217013" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.219603 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9f2qr"] Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.221169 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9f2qr" podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerName="registry-server" containerID="cri-o://130c61d4452e7cb63135189227fd8440bd431100dab795b0ad174be593a586db" gracePeriod=30 Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.248003 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p9hf8"] Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.248589 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p9hf8" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerName="registry-server" containerID="cri-o://da5f92db22136c6da85e2e7bf530e950bc0cdafdab8b82f0df50cf88c7e72212" gracePeriod=30 Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.251823 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtsxk"] Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.252134 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" podUID="b59e4a45-c4df-4754-8e3e-83e750aee402" containerName="marketplace-operator" containerID="cri-o://89a4fb98e6dccc5f688f2bc02950b914caf07cde7ebe25a62e8ea622d5b2e78f" gracePeriod=30 Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.257947 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8b9l"] Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.258476 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x8b9l" podUID="a1c11523-129d-450b-bd1f-b72d6a019660" containerName="registry-server" containerID="cri-o://6cd65f97545e9af53c7100c933165842f6c8a81cd57fc93f761084d0aceb1617" gracePeriod=30 Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.272644 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2vk2"] Oct 02 09:38:02 crc kubenswrapper[4722]: 
I1002 09:38:02.273026 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n2vk2" podUID="d2ad2457-159c-4841-82fe-ea109055fe28" containerName="registry-server" containerID="cri-o://c09cf7f0eff32335f09b45a808a3b9080c9ff115555c4027130da6889a0b1c49" gracePeriod=30 Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.288389 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nkmjs"] Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.289590 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.299871 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nkmjs"] Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.387347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nkmjs\" (UID: \"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.387538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nkmjs\" (UID: \"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.387655 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6vfv\" (UniqueName: \"kubernetes.io/projected/bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8-kube-api-access-s6vfv\") pod \"marketplace-operator-79b997595-nkmjs\" (UID: \"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.488836 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nkmjs\" (UID: \"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.488925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nkmjs\" (UID: \"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.488972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6vfv\" (UniqueName: \"kubernetes.io/projected/bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8-kube-api-access-s6vfv\") pod \"marketplace-operator-79b997595-nkmjs\" (UID: \"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.490263 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nkmjs\" (UID: \"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.498494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nkmjs\" (UID: \"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.507906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6vfv\" (UniqueName: \"kubernetes.io/projected/bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8-kube-api-access-s6vfv\") pod \"marketplace-operator-79b997595-nkmjs\" (UID: \"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.620069 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.736657 4722 generic.go:334] "Generic (PLEG): container finished" podID="b59e4a45-c4df-4754-8e3e-83e750aee402" containerID="89a4fb98e6dccc5f688f2bc02950b914caf07cde7ebe25a62e8ea622d5b2e78f" exitCode=0 Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.737139 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" event={"ID":"b59e4a45-c4df-4754-8e3e-83e750aee402","Type":"ContainerDied","Data":"89a4fb98e6dccc5f688f2bc02950b914caf07cde7ebe25a62e8ea622d5b2e78f"} Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.739785 4722 generic.go:334] "Generic (PLEG): container finished" podID="d2ad2457-159c-4841-82fe-ea109055fe28" containerID="c09cf7f0eff32335f09b45a808a3b9080c9ff115555c4027130da6889a0b1c49" exitCode=0 Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.739838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2vk2" event={"ID":"d2ad2457-159c-4841-82fe-ea109055fe28","Type":"ContainerDied","Data":"c09cf7f0eff32335f09b45a808a3b9080c9ff115555c4027130da6889a0b1c49"} Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.742829 4722 generic.go:334] "Generic (PLEG): container finished" podID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerID="da5f92db22136c6da85e2e7bf530e950bc0cdafdab8b82f0df50cf88c7e72212" exitCode=0 Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.742892 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9hf8" event={"ID":"1e7f1ec0-f82a-4911-b120-ba1c25a4a200","Type":"ContainerDied","Data":"da5f92db22136c6da85e2e7bf530e950bc0cdafdab8b82f0df50cf88c7e72212"} Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.747248 4722 generic.go:334] "Generic (PLEG): container finished" podID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerID="130c61d4452e7cb63135189227fd8440bd431100dab795b0ad174be593a586db" exitCode=0 Oct 02 
09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.747362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2qr" event={"ID":"5c352a70-fb33-4e66-a950-0e831abd5d45","Type":"ContainerDied","Data":"130c61d4452e7cb63135189227fd8440bd431100dab795b0ad174be593a586db"} Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.749413 4722 generic.go:334] "Generic (PLEG): container finished" podID="a1c11523-129d-450b-bd1f-b72d6a019660" containerID="6cd65f97545e9af53c7100c933165842f6c8a81cd57fc93f761084d0aceb1617" exitCode=0 Oct 02 09:38:02 crc kubenswrapper[4722]: I1002 09:38:02.749454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8b9l" event={"ID":"a1c11523-129d-450b-bd1f-b72d6a019660","Type":"ContainerDied","Data":"6cd65f97545e9af53c7100c933165842f6c8a81cd57fc93f761084d0aceb1617"} Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.079582 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nkmjs"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.225405 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.415628 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cphrq\" (UniqueName: \"kubernetes.io/projected/5c352a70-fb33-4e66-a950-0e831abd5d45-kube-api-access-cphrq\") pod \"5c352a70-fb33-4e66-a950-0e831abd5d45\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.416184 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-catalog-content\") pod \"5c352a70-fb33-4e66-a950-0e831abd5d45\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.416291 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-utilities\") pod \"5c352a70-fb33-4e66-a950-0e831abd5d45\" (UID: \"5c352a70-fb33-4e66-a950-0e831abd5d45\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.418646 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-utilities" (OuterVolumeSpecName: "utilities") pod "5c352a70-fb33-4e66-a950-0e831abd5d45" (UID: "5c352a70-fb33-4e66-a950-0e831abd5d45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.430791 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c352a70-fb33-4e66-a950-0e831abd5d45-kube-api-access-cphrq" (OuterVolumeSpecName: "kube-api-access-cphrq") pod "5c352a70-fb33-4e66-a950-0e831abd5d45" (UID: "5c352a70-fb33-4e66-a950-0e831abd5d45"). InnerVolumeSpecName "kube-api-access-cphrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.482003 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c352a70-fb33-4e66-a950-0e831abd5d45" (UID: "5c352a70-fb33-4e66-a950-0e831abd5d45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.488507 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.497285 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.499757 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.510963 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.531233 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.531278 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cphrq\" (UniqueName: \"kubernetes.io/projected/5c352a70-fb33-4e66-a950-0e831abd5d45-kube-api-access-cphrq\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.531294 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c352a70-fb33-4e66-a950-0e831abd5d45-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.632536 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-utilities\") pod \"d2ad2457-159c-4841-82fe-ea109055fe28\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.632618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg9mx\" (UniqueName: \"kubernetes.io/projected/a1c11523-129d-450b-bd1f-b72d6a019660-kube-api-access-tg9mx\") pod \"a1c11523-129d-450b-bd1f-b72d6a019660\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.632645 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-utilities\") pod \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.632677 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-utilities\") pod \"a1c11523-129d-450b-bd1f-b72d6a019660\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " Oct 02 09:38:03 crc kubenswrapper[4722]: 
I1002 09:38:03.632836 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-operator-metrics\") pod \"b59e4a45-c4df-4754-8e3e-83e750aee402\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.632911 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vqbj\" (UniqueName: \"kubernetes.io/projected/b59e4a45-c4df-4754-8e3e-83e750aee402-kube-api-access-5vqbj\") pod \"b59e4a45-c4df-4754-8e3e-83e750aee402\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.632938 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-catalog-content\") pod \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.632964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-trusted-ca\") pod \"b59e4a45-c4df-4754-8e3e-83e750aee402\" (UID: \"b59e4a45-c4df-4754-8e3e-83e750aee402\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.633030 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72slq\" (UniqueName: \"kubernetes.io/projected/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-kube-api-access-72slq\") pod \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\" (UID: \"1e7f1ec0-f82a-4911-b120-ba1c25a4a200\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.633078 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-catalog-content\") pod \"d2ad2457-159c-4841-82fe-ea109055fe28\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.633106 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-catalog-content\") pod \"a1c11523-129d-450b-bd1f-b72d6a019660\" (UID: \"a1c11523-129d-450b-bd1f-b72d6a019660\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.633165 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f9ng\" (UniqueName: \"kubernetes.io/projected/d2ad2457-159c-4841-82fe-ea109055fe28-kube-api-access-4f9ng\") pod \"d2ad2457-159c-4841-82fe-ea109055fe28\" (UID: \"d2ad2457-159c-4841-82fe-ea109055fe28\") " Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.633424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-utilities" (OuterVolumeSpecName: "utilities") pod "d2ad2457-159c-4841-82fe-ea109055fe28" (UID: "d2ad2457-159c-4841-82fe-ea109055fe28"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.633730 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.634350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-utilities" (OuterVolumeSpecName: "utilities") pod "1e7f1ec0-f82a-4911-b120-ba1c25a4a200" (UID: "1e7f1ec0-f82a-4911-b120-ba1c25a4a200"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.634741 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b59e4a45-c4df-4754-8e3e-83e750aee402" (UID: "b59e4a45-c4df-4754-8e3e-83e750aee402"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.636755 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c11523-129d-450b-bd1f-b72d6a019660-kube-api-access-tg9mx" (OuterVolumeSpecName: "kube-api-access-tg9mx") pod "a1c11523-129d-450b-bd1f-b72d6a019660" (UID: "a1c11523-129d-450b-bd1f-b72d6a019660"). InnerVolumeSpecName "kube-api-access-tg9mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.636807 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ad2457-159c-4841-82fe-ea109055fe28-kube-api-access-4f9ng" (OuterVolumeSpecName: "kube-api-access-4f9ng") pod "d2ad2457-159c-4841-82fe-ea109055fe28" (UID: "d2ad2457-159c-4841-82fe-ea109055fe28"). InnerVolumeSpecName "kube-api-access-4f9ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.643900 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-utilities" (OuterVolumeSpecName: "utilities") pod "a1c11523-129d-450b-bd1f-b72d6a019660" (UID: "a1c11523-129d-450b-bd1f-b72d6a019660"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.644289 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b59e4a45-c4df-4754-8e3e-83e750aee402" (UID: "b59e4a45-c4df-4754-8e3e-83e750aee402"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.644335 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b59e4a45-c4df-4754-8e3e-83e750aee402-kube-api-access-5vqbj" (OuterVolumeSpecName: "kube-api-access-5vqbj") pod "b59e4a45-c4df-4754-8e3e-83e750aee402" (UID: "b59e4a45-c4df-4754-8e3e-83e750aee402"). InnerVolumeSpecName "kube-api-access-5vqbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.644926 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-kube-api-access-72slq" (OuterVolumeSpecName: "kube-api-access-72slq") pod "1e7f1ec0-f82a-4911-b120-ba1c25a4a200" (UID: "1e7f1ec0-f82a-4911-b120-ba1c25a4a200"). InnerVolumeSpecName "kube-api-access-72slq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.655641 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1c11523-129d-450b-bd1f-b72d6a019660" (UID: "a1c11523-129d-450b-bd1f-b72d6a019660"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.710304 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e7f1ec0-f82a-4911-b120-ba1c25a4a200" (UID: "1e7f1ec0-f82a-4911-b120-ba1c25a4a200"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.722744 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2ad2457-159c-4841-82fe-ea109055fe28" (UID: "d2ad2457-159c-4841-82fe-ea109055fe28"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.735432 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.735626 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.735715 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vqbj\" (UniqueName: \"kubernetes.io/projected/b59e4a45-c4df-4754-8e3e-83e750aee402-kube-api-access-5vqbj\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.735798 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.735866 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b59e4a45-c4df-4754-8e3e-83e750aee402-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.735950 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72slq\" (UniqueName: \"kubernetes.io/projected/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-kube-api-access-72slq\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.736016 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ad2457-159c-4841-82fe-ea109055fe28-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.736104 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c11523-129d-450b-bd1f-b72d6a019660-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.736171 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f9ng\" (UniqueName: \"kubernetes.io/projected/d2ad2457-159c-4841-82fe-ea109055fe28-kube-api-access-4f9ng\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.736231 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg9mx\" (UniqueName: \"kubernetes.io/projected/a1c11523-129d-450b-bd1f-b72d6a019660-kube-api-access-tg9mx\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.736286 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e7f1ec0-f82a-4911-b120-ba1c25a4a200-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.757091 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2vk2" event={"ID":"d2ad2457-159c-4841-82fe-ea109055fe28","Type":"ContainerDied","Data":"b9bd2f9b9942cb8b1359c5a8a6ddd457869d8176a0b690be9b22e0ca4e023d61"} Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.757161 4722 scope.go:117] "RemoveContainer" 
containerID="c09cf7f0eff32335f09b45a808a3b9080c9ff115555c4027130da6889a0b1c49" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.757297 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2vk2" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.763359 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9hf8" event={"ID":"1e7f1ec0-f82a-4911-b120-ba1c25a4a200","Type":"ContainerDied","Data":"bf2d679f68c08b0ee3c360eef3f842e8805de28eee48911b32b4dbc84e6b1af1"} Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.763423 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p9hf8" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.767971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2qr" event={"ID":"5c352a70-fb33-4e66-a950-0e831abd5d45","Type":"ContainerDied","Data":"7c505a10b19b5150c6458a308837f7b57a39d881993b59ac6635834f65c4d30e"} Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.768392 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9f2qr" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.771850 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8b9l" event={"ID":"a1c11523-129d-450b-bd1f-b72d6a019660","Type":"ContainerDied","Data":"536e584e52a908b96f24fdebe628578be9771395a770e786eff418d04a02c2f3"} Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.772070 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8b9l" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.774264 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" event={"ID":"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8","Type":"ContainerStarted","Data":"44d19d2f1349199ccff49df61237758b4d2fd498f903d054f15cca6c8b46e525"} Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.774333 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" event={"ID":"bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8","Type":"ContainerStarted","Data":"0a775faf585a05b19415464800203467167b3d1e23ad969c82062d4a5be14fe5"} Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.774978 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.775970 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nkmjs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.776235 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" podUID="bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.777227 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" event={"ID":"b59e4a45-c4df-4754-8e3e-83e750aee402","Type":"ContainerDied","Data":"76371d9dd430f1ddf40e267b1670ad84e39a2c4c2e75024f29f91f3a7db2c58c"} Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.777295 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtsxk" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.785101 4722 scope.go:117] "RemoveContainer" containerID="aebd8e60f89edd498d0e2754009812eddc78fe2725b3d8a029cf6714fe068033" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.797377 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" podStartSLOduration=1.797348988 podStartE2EDuration="1.797348988s" podCreationTimestamp="2025-10-02 09:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:38:03.794103223 +0000 UTC m=+339.830856203" watchObservedRunningTime="2025-10-02 09:38:03.797348988 +0000 UTC m=+339.834101968" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.808538 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2vk2"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.810709 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n2vk2"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.819172 4722 scope.go:117] "RemoveContainer" containerID="f9e273f42a8e41743ee6d02bb596c715950a7a5ba9ef749d728a6ff1dbe73928" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.824867 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9f2qr"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.829149 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9f2qr"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.831902 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtsxk"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.838160 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtsxk"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.853224 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p9hf8"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.858707 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p9hf8"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.861577 4722 scope.go:117] "RemoveContainer" containerID="da5f92db22136c6da85e2e7bf530e950bc0cdafdab8b82f0df50cf88c7e72212" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.873160 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8b9l"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.876294 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8b9l"] Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.888259 4722 scope.go:117] "RemoveContainer" containerID="0485a102bafff7273eb0d38e8d409f6e271c5eeefa8eb4224921d464b0c19860" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.902303 4722 
scope.go:117] "RemoveContainer" containerID="861c05cecc7e7be32a4c87f237d5692e69747ac8726e983ad2b7e9496ef91eb0" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.917534 4722 scope.go:117] "RemoveContainer" containerID="130c61d4452e7cb63135189227fd8440bd431100dab795b0ad174be593a586db" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.933543 4722 scope.go:117] "RemoveContainer" containerID="a56d6926deffff472a1633688231c11ea6a6dfd36281317320dbcd3c432d1918" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.949905 4722 scope.go:117] "RemoveContainer" containerID="a8ab04421e796425247d2fd81d09e955d4da9ec2d47fdbace5106e9705ec8ab0" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.967959 4722 scope.go:117] "RemoveContainer" containerID="6cd65f97545e9af53c7100c933165842f6c8a81cd57fc93f761084d0aceb1617" Oct 02 09:38:03 crc kubenswrapper[4722]: I1002 09:38:03.995704 4722 scope.go:117] "RemoveContainer" containerID="73febdb5045fdc3021b0236dd2fed29399bd0fbeffdefe8c31c73e728f4ee2ce" Oct 02 09:38:04 crc kubenswrapper[4722]: I1002 09:38:04.011294 4722 scope.go:117] "RemoveContainer" containerID="117e8feae45f8b255f2bcbfdf3dcb5ac41938decb2108d76a7d90fd228c57271" Oct 02 09:38:04 crc kubenswrapper[4722]: I1002 09:38:04.025704 4722 scope.go:117] "RemoveContainer" containerID="89a4fb98e6dccc5f688f2bc02950b914caf07cde7ebe25a62e8ea622d5b2e78f" Oct 02 09:38:04 crc kubenswrapper[4722]: I1002 09:38:04.730386 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" path="/var/lib/kubelet/pods/1e7f1ec0-f82a-4911-b120-ba1c25a4a200/volumes" Oct 02 09:38:04 crc kubenswrapper[4722]: I1002 09:38:04.731618 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" path="/var/lib/kubelet/pods/5c352a70-fb33-4e66-a950-0e831abd5d45/volumes" Oct 02 09:38:04 crc kubenswrapper[4722]: I1002 09:38:04.732361 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c11523-129d-450b-bd1f-b72d6a019660" path="/var/lib/kubelet/pods/a1c11523-129d-450b-bd1f-b72d6a019660/volumes" Oct 02 09:38:04 crc kubenswrapper[4722]: I1002 09:38:04.733839 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b59e4a45-c4df-4754-8e3e-83e750aee402" path="/var/lib/kubelet/pods/b59e4a45-c4df-4754-8e3e-83e750aee402/volumes" Oct 02 09:38:04 crc kubenswrapper[4722]: I1002 09:38:04.734372 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ad2457-159c-4841-82fe-ea109055fe28" path="/var/lib/kubelet/pods/d2ad2457-159c-4841-82fe-ea109055fe28/volumes" Oct 02 09:38:04 crc kubenswrapper[4722]: I1002 09:38:04.793082 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nkmjs" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.239489 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rfdss"] Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.239788 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b59e4a45-c4df-4754-8e3e-83e750aee402" containerName="marketplace-operator" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.239806 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b59e4a45-c4df-4754-8e3e-83e750aee402" containerName="marketplace-operator" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.239822 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerName="extract-utilities" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.239829 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerName="extract-utilities" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.239837 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ad2457-159c-4841-82fe-ea109055fe28" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.239869 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ad2457-159c-4841-82fe-ea109055fe28" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.239877 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c11523-129d-450b-bd1f-b72d6a019660" containerName="extract-content" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.239885 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c11523-129d-450b-bd1f-b72d6a019660" containerName="extract-content" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.239900 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.239907 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.239918 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ad2457-159c-4841-82fe-ea109055fe28" containerName="extract-content" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.239925 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ad2457-159c-4841-82fe-ea109055fe28" containerName="extract-content" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.239937 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerName="extract-utilities" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.239945 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerName="extract-utilities" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.239957 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerName="extract-content" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.239965 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerName="extract-content" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.239978 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ad2457-159c-4841-82fe-ea109055fe28" containerName="extract-utilities" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.239986 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ad2457-159c-4841-82fe-ea109055fe28" containerName="extract-utilities" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.239994 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.240002 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.240013 4722 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a1c11523-129d-450b-bd1f-b72d6a019660" containerName="extract-utilities" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.240021 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c11523-129d-450b-bd1f-b72d6a019660" containerName="extract-utilities" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.240032 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c11523-129d-450b-bd1f-b72d6a019660" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.240043 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c11523-129d-450b-bd1f-b72d6a019660" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: E1002 09:38:05.240054 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerName="extract-content" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.240063 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerName="extract-content" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.240185 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e7f1ec0-f82a-4911-b120-ba1c25a4a200" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.240207 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c11523-129d-450b-bd1f-b72d6a019660" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.240216 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ad2457-159c-4841-82fe-ea109055fe28" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.240225 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b59e4a45-c4df-4754-8e3e-83e750aee402" containerName="marketplace-operator" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.240233 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c352a70-fb33-4e66-a950-0e831abd5d45" containerName="registry-server" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.241220 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.244404 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.255201 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rfdss"] Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.359051 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6q6m\" (UniqueName: \"kubernetes.io/projected/abd654b8-cf56-4df7-840f-50f39edc1d1e-kube-api-access-n6q6m\") pod \"certified-operators-rfdss\" (UID: \"abd654b8-cf56-4df7-840f-50f39edc1d1e\") " pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.359152 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd654b8-cf56-4df7-840f-50f39edc1d1e-catalog-content\") pod \"certified-operators-rfdss\" (UID: \"abd654b8-cf56-4df7-840f-50f39edc1d1e\") " pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.359663 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd654b8-cf56-4df7-840f-50f39edc1d1e-utilities\") pod \"certified-operators-rfdss\" (UID: \"abd654b8-cf56-4df7-840f-50f39edc1d1e\") " pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.460833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd654b8-cf56-4df7-840f-50f39edc1d1e-catalog-content\") pod \"certified-operators-rfdss\" (UID: \"abd654b8-cf56-4df7-840f-50f39edc1d1e\") " pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.460957 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd654b8-cf56-4df7-840f-50f39edc1d1e-utilities\") pod \"certified-operators-rfdss\" (UID: \"abd654b8-cf56-4df7-840f-50f39edc1d1e\") " pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.461002 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6q6m\" (UniqueName: \"kubernetes.io/projected/abd654b8-cf56-4df7-840f-50f39edc1d1e-kube-api-access-n6q6m\") pod \"certified-operators-rfdss\" (UID: \"abd654b8-cf56-4df7-840f-50f39edc1d1e\") " pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.461752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd654b8-cf56-4df7-840f-50f39edc1d1e-utilities\") pod \"certified-operators-rfdss\" (UID: \"abd654b8-cf56-4df7-840f-50f39edc1d1e\") " pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.462268 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd654b8-cf56-4df7-840f-50f39edc1d1e-catalog-content\") pod \"certified-operators-rfdss\" (UID: 
\"abd654b8-cf56-4df7-840f-50f39edc1d1e\") " pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.484474 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6q6m\" (UniqueName: \"kubernetes.io/projected/abd654b8-cf56-4df7-840f-50f39edc1d1e-kube-api-access-n6q6m\") pod \"certified-operators-rfdss\" (UID: \"abd654b8-cf56-4df7-840f-50f39edc1d1e\") " pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.572791 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.791181 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rfdss"] Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.844402 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k6qks"] Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.848134 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.850595 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.859342 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6qks"] Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.967910 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987e526d-2094-4eb5-a911-c58aee2b53a4-utilities\") pod \"redhat-operators-k6qks\" (UID: \"987e526d-2094-4eb5-a911-c58aee2b53a4\") " pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.968493 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987e526d-2094-4eb5-a911-c58aee2b53a4-catalog-content\") pod \"redhat-operators-k6qks\" (UID: \"987e526d-2094-4eb5-a911-c58aee2b53a4\") " pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:05 crc kubenswrapper[4722]: I1002 09:38:05.968542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpzxt\" (UniqueName: \"kubernetes.io/projected/987e526d-2094-4eb5-a911-c58aee2b53a4-kube-api-access-fpzxt\") pod \"redhat-operators-k6qks\" (UID: \"987e526d-2094-4eb5-a911-c58aee2b53a4\") " pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.070149 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987e526d-2094-4eb5-a911-c58aee2b53a4-catalog-content\") pod \"redhat-operators-k6qks\" (UID: \"987e526d-2094-4eb5-a911-c58aee2b53a4\") " pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.070208 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpzxt\" (UniqueName: \"kubernetes.io/projected/987e526d-2094-4eb5-a911-c58aee2b53a4-kube-api-access-fpzxt\") pod \"redhat-operators-k6qks\" (UID: \"987e526d-2094-4eb5-a911-c58aee2b53a4\") " 
pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.070272 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987e526d-2094-4eb5-a911-c58aee2b53a4-utilities\") pod \"redhat-operators-k6qks\" (UID: \"987e526d-2094-4eb5-a911-c58aee2b53a4\") " pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.071202 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987e526d-2094-4eb5-a911-c58aee2b53a4-utilities\") pod \"redhat-operators-k6qks\" (UID: \"987e526d-2094-4eb5-a911-c58aee2b53a4\") " pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.071306 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987e526d-2094-4eb5-a911-c58aee2b53a4-catalog-content\") pod \"redhat-operators-k6qks\" (UID: \"987e526d-2094-4eb5-a911-c58aee2b53a4\") " pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.092000 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpzxt\" (UniqueName: \"kubernetes.io/projected/987e526d-2094-4eb5-a911-c58aee2b53a4-kube-api-access-fpzxt\") pod \"redhat-operators-k6qks\" (UID: \"987e526d-2094-4eb5-a911-c58aee2b53a4\") " pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.203124 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.606634 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6qks"] Oct 02 09:38:06 crc kubenswrapper[4722]: W1002 09:38:06.612814 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987e526d_2094_4eb5_a911_c58aee2b53a4.slice/crio-d5e06972eb2d234765a6d5df58aab79f9568804cfd8b73d8459ff2237799cdb5 WatchSource:0}: Error finding container d5e06972eb2d234765a6d5df58aab79f9568804cfd8b73d8459ff2237799cdb5: Status 404 returned error can't find the container with id d5e06972eb2d234765a6d5df58aab79f9568804cfd8b73d8459ff2237799cdb5 Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.806513 4722 generic.go:334] "Generic (PLEG): container finished" podID="987e526d-2094-4eb5-a911-c58aee2b53a4" containerID="d8621a2de4f28aeebb7c3ca879e1fa22eaeb32ca58b4613ab9969eaf3e71ab1a" exitCode=0 Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.806717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6qks" event={"ID":"987e526d-2094-4eb5-a911-c58aee2b53a4","Type":"ContainerDied","Data":"d8621a2de4f28aeebb7c3ca879e1fa22eaeb32ca58b4613ab9969eaf3e71ab1a"} Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.807008 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6qks" event={"ID":"987e526d-2094-4eb5-a911-c58aee2b53a4","Type":"ContainerStarted","Data":"d5e06972eb2d234765a6d5df58aab79f9568804cfd8b73d8459ff2237799cdb5"} Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.810914 4722 generic.go:334] "Generic (PLEG): container finished" podID="abd654b8-cf56-4df7-840f-50f39edc1d1e" 
containerID="84662f6a8508c6b9b64a3f179c7ae4cc2efc822e40afee2ba4212b6e00c0f0c5" exitCode=0 Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.810992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfdss" event={"ID":"abd654b8-cf56-4df7-840f-50f39edc1d1e","Type":"ContainerDied","Data":"84662f6a8508c6b9b64a3f179c7ae4cc2efc822e40afee2ba4212b6e00c0f0c5"} Oct 02 09:38:06 crc kubenswrapper[4722]: I1002 09:38:06.811032 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfdss" event={"ID":"abd654b8-cf56-4df7-840f-50f39edc1d1e","Type":"ContainerStarted","Data":"8be149ded22c12beb10f5a2d47c623f254181ec53c36e99cf267572eee386764"} Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.640900 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wcbff"] Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.642521 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcbff" Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.645515 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.656132 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcbff"] Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.796080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aba904b-b2dc-4eaf-b466-36c73f90de8a-utilities\") pod \"community-operators-wcbff\" (UID: \"1aba904b-b2dc-4eaf-b466-36c73f90de8a\") " pod="openshift-marketplace/community-operators-wcbff" Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.796180 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aba904b-b2dc-4eaf-b466-36c73f90de8a-catalog-content\") pod \"community-operators-wcbff\" (UID: \"1aba904b-b2dc-4eaf-b466-36c73f90de8a\") " pod="openshift-marketplace/community-operators-wcbff" Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.796252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftv8c\" (UniqueName: \"kubernetes.io/projected/1aba904b-b2dc-4eaf-b466-36c73f90de8a-kube-api-access-ftv8c\") pod \"community-operators-wcbff\" (UID: \"1aba904b-b2dc-4eaf-b466-36c73f90de8a\") " pod="openshift-marketplace/community-operators-wcbff" Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.897636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftv8c\" (UniqueName: \"kubernetes.io/projected/1aba904b-b2dc-4eaf-b466-36c73f90de8a-kube-api-access-ftv8c\") pod \"community-operators-wcbff\" (UID: \"1aba904b-b2dc-4eaf-b466-36c73f90de8a\") " pod="openshift-marketplace/community-operators-wcbff" Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.897743 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aba904b-b2dc-4eaf-b466-36c73f90de8a-utilities\") pod \"community-operators-wcbff\" (UID: \"1aba904b-b2dc-4eaf-b466-36c73f90de8a\") " pod="openshift-marketplace/community-operators-wcbff" Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 
Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.898004 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aba904b-b2dc-4eaf-b466-36c73f90de8a-catalog-content\") pod \"community-operators-wcbff\" (UID: \"1aba904b-b2dc-4eaf-b466-36c73f90de8a\") " pod="openshift-marketplace/community-operators-wcbff"
Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.898230 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aba904b-b2dc-4eaf-b466-36c73f90de8a-utilities\") pod \"community-operators-wcbff\" (UID: \"1aba904b-b2dc-4eaf-b466-36c73f90de8a\") " pod="openshift-marketplace/community-operators-wcbff"
Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.898420 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aba904b-b2dc-4eaf-b466-36c73f90de8a-catalog-content\") pod \"community-operators-wcbff\" (UID: \"1aba904b-b2dc-4eaf-b466-36c73f90de8a\") " pod="openshift-marketplace/community-operators-wcbff"
Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.927932 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftv8c\" (UniqueName: \"kubernetes.io/projected/1aba904b-b2dc-4eaf-b466-36c73f90de8a-kube-api-access-ftv8c\") pod \"community-operators-wcbff\" (UID: \"1aba904b-b2dc-4eaf-b466-36c73f90de8a\") " pod="openshift-marketplace/community-operators-wcbff"
Oct 02 09:38:07 crc kubenswrapper[4722]: I1002 09:38:07.974786 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcbff"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.239093 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pc9sc"]
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.261218 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pc9sc"]
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.261410 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.266545 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.305928 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcbff"]
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.310911 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd8rl\" (UniqueName: \"kubernetes.io/projected/f7eec357-442f-49c5-a539-067608f33f4f-kube-api-access-kd8rl\") pod \"redhat-marketplace-pc9sc\" (UID: \"f7eec357-442f-49c5-a539-067608f33f4f\") " pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.310964 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7eec357-442f-49c5-a539-067608f33f4f-catalog-content\") pod \"redhat-marketplace-pc9sc\" (UID: \"f7eec357-442f-49c5-a539-067608f33f4f\") " pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.311107 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7eec357-442f-49c5-a539-067608f33f4f-utilities\") pod \"redhat-marketplace-pc9sc\" (UID: \"f7eec357-442f-49c5-a539-067608f33f4f\") " pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.412456 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd8rl\" (UniqueName: \"kubernetes.io/projected/f7eec357-442f-49c5-a539-067608f33f4f-kube-api-access-kd8rl\") pod \"redhat-marketplace-pc9sc\" (UID: \"f7eec357-442f-49c5-a539-067608f33f4f\") " pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.412512 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7eec357-442f-49c5-a539-067608f33f4f-catalog-content\") pod \"redhat-marketplace-pc9sc\" (UID: \"f7eec357-442f-49c5-a539-067608f33f4f\") " pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.412568 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7eec357-442f-49c5-a539-067608f33f4f-utilities\") pod \"redhat-marketplace-pc9sc\" (UID: \"f7eec357-442f-49c5-a539-067608f33f4f\") " pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.413429 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7eec357-442f-49c5-a539-067608f33f4f-utilities\") pod \"redhat-marketplace-pc9sc\" (UID: \"f7eec357-442f-49c5-a539-067608f33f4f\") " pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.413520 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7eec357-442f-49c5-a539-067608f33f4f-catalog-content\") pod \"redhat-marketplace-pc9sc\" (UID: \"f7eec357-442f-49c5-a539-067608f33f4f\") " pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.435459 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd8rl\" (UniqueName: \"kubernetes.io/projected/f7eec357-442f-49c5-a539-067608f33f4f-kube-api-access-kd8rl\") pod \"redhat-marketplace-pc9sc\" (UID: \"f7eec357-442f-49c5-a539-067608f33f4f\") " pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.598267 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pc9sc"
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.831752 4722 generic.go:334] "Generic (PLEG): container finished" podID="1aba904b-b2dc-4eaf-b466-36c73f90de8a" containerID="346c1c684d411705495119b997b6dfedf222e24d27d29b5e26e329f9776b116d" exitCode=0
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.832787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcbff" event={"ID":"1aba904b-b2dc-4eaf-b466-36c73f90de8a","Type":"ContainerDied","Data":"346c1c684d411705495119b997b6dfedf222e24d27d29b5e26e329f9776b116d"}
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.832884 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pc9sc"]
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.832907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcbff" event={"ID":"1aba904b-b2dc-4eaf-b466-36c73f90de8a","Type":"ContainerStarted","Data":"3e7b374fb0475e6dab13027e3ac675fccf47c627f4d415b24af5d3da30eb952a"}
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.835971 4722 generic.go:334] "Generic (PLEG): container finished" podID="abd654b8-cf56-4df7-840f-50f39edc1d1e" containerID="dd20f311a194e597b0241a89805c36bec23ea8604371813f93c751f56d2269a9" exitCode=0
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.836043 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfdss" event={"ID":"abd654b8-cf56-4df7-840f-50f39edc1d1e","Type":"ContainerDied","Data":"dd20f311a194e597b0241a89805c36bec23ea8604371813f93c751f56d2269a9"}
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.839402 4722 generic.go:334] "Generic (PLEG): container finished" podID="987e526d-2094-4eb5-a911-c58aee2b53a4" containerID="759f1534664afb79d18a45429a00de4c9e50d0fb616a4867b12dc747e0b722ab" exitCode=0
Oct 02 09:38:08 crc kubenswrapper[4722]: I1002 09:38:08.839446 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6qks" event={"ID":"987e526d-2094-4eb5-a911-c58aee2b53a4","Type":"ContainerDied","Data":"759f1534664afb79d18a45429a00de4c9e50d0fb616a4867b12dc747e0b722ab"}
Oct 02 09:38:08 crc kubenswrapper[4722]: W1002 09:38:08.850493 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7eec357_442f_49c5_a539_067608f33f4f.slice/crio-d5619922d8eae86101c90872e9ffcf5f70af32be737e729952535e740ffa07de WatchSource:0}: Error finding container d5619922d8eae86101c90872e9ffcf5f70af32be737e729952535e740ffa07de: Status 404 returned error can't find the container with id d5619922d8eae86101c90872e9ffcf5f70af32be737e729952535e740ffa07de
Oct 02 09:38:09 crc kubenswrapper[4722]: I1002 09:38:09.848667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfdss" event={"ID":"abd654b8-cf56-4df7-840f-50f39edc1d1e","Type":"ContainerStarted","Data":"10858501b68e7ea2d0cbf7c9b381f3ebf48c9e23766d8c08dbc7b384ef260069"}
Oct 02 09:38:09 crc kubenswrapper[4722]: I1002 09:38:09.851690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6qks" event={"ID":"987e526d-2094-4eb5-a911-c58aee2b53a4","Type":"ContainerStarted","Data":"dee97e713aceb6a8bb112bc9226d2fbe59aa9037bd830e120bd8c157e1d85565"}
Oct 02 09:38:09 crc kubenswrapper[4722]: I1002 09:38:09.854160 4722 generic.go:334] "Generic (PLEG): container finished" podID="f7eec357-442f-49c5-a539-067608f33f4f" containerID="ce0b6df1d616f15ad95927fea158d7defa40ee82213c7fbcf1e7bdbf313c81ba" exitCode=0
Oct 02 09:38:09 crc kubenswrapper[4722]: I1002 09:38:09.854243 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pc9sc" event={"ID":"f7eec357-442f-49c5-a539-067608f33f4f","Type":"ContainerDied","Data":"ce0b6df1d616f15ad95927fea158d7defa40ee82213c7fbcf1e7bdbf313c81ba"}
Oct 02 09:38:09 crc kubenswrapper[4722]: I1002 09:38:09.854268 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pc9sc" event={"ID":"f7eec357-442f-49c5-a539-067608f33f4f","Type":"ContainerStarted","Data":"d5619922d8eae86101c90872e9ffcf5f70af32be737e729952535e740ffa07de"}
Oct 02 09:38:09 crc kubenswrapper[4722]: I1002 09:38:09.859045 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcbff" event={"ID":"1aba904b-b2dc-4eaf-b466-36c73f90de8a","Type":"ContainerStarted","Data":"05f2d3f3a64467b016a62d36250dbf6a79a651706704370ed18bdb114b3549d5"}
Oct 02 09:38:09 crc kubenswrapper[4722]: I1002 09:38:09.881551 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rfdss" podStartSLOduration=2.420781654 podStartE2EDuration="4.881522276s" podCreationTimestamp="2025-10-02 09:38:05 +0000 UTC" firstStartedPulling="2025-10-02 09:38:06.814043127 +0000 UTC m=+342.850796107" lastFinishedPulling="2025-10-02 09:38:09.274783749 +0000 UTC m=+345.311536729" observedRunningTime="2025-10-02 09:38:09.874617265 +0000 UTC m=+345.911370255" watchObservedRunningTime="2025-10-02 09:38:09.881522276 +0000 UTC m=+345.918275256"
Oct 02 09:38:09 crc kubenswrapper[4722]: I1002 09:38:09.936673 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k6qks" podStartSLOduration=2.17106322 podStartE2EDuration="4.9366434s" podCreationTimestamp="2025-10-02 09:38:05 +0000 UTC" firstStartedPulling="2025-10-02 09:38:06.81074959 +0000 UTC m=+342.847502570" lastFinishedPulling="2025-10-02 09:38:09.57632975 +0000 UTC m=+345.613082750" observedRunningTime="2025-10-02 09:38:09.933917489 +0000 UTC m=+345.970670489" watchObservedRunningTime="2025-10-02 09:38:09.9366434 +0000 UTC m=+345.973396390"
Oct 02 09:38:10 crc kubenswrapper[4722]: I1002 09:38:10.866755 4722 generic.go:334] "Generic (PLEG): container finished" podID="1aba904b-b2dc-4eaf-b466-36c73f90de8a" containerID="05f2d3f3a64467b016a62d36250dbf6a79a651706704370ed18bdb114b3549d5" exitCode=0
Oct 02 09:38:10 crc kubenswrapper[4722]: I1002 09:38:10.866853 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcbff" event={"ID":"1aba904b-b2dc-4eaf-b466-36c73f90de8a","Type":"ContainerDied","Data":"05f2d3f3a64467b016a62d36250dbf6a79a651706704370ed18bdb114b3549d5"}
event={"ID":"1aba904b-b2dc-4eaf-b466-36c73f90de8a","Type":"ContainerDied","Data":"05f2d3f3a64467b016a62d36250dbf6a79a651706704370ed18bdb114b3549d5"} Oct 02 09:38:11 crc kubenswrapper[4722]: I1002 09:38:11.876791 4722 generic.go:334] "Generic (PLEG): container finished" podID="f7eec357-442f-49c5-a539-067608f33f4f" containerID="3d897788c579c98648a0ae983a20ca9e26d56cdf6017fc193acaacea3179e5b3" exitCode=0 Oct 02 09:38:11 crc kubenswrapper[4722]: I1002 09:38:11.876892 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pc9sc" event={"ID":"f7eec357-442f-49c5-a539-067608f33f4f","Type":"ContainerDied","Data":"3d897788c579c98648a0ae983a20ca9e26d56cdf6017fc193acaacea3179e5b3"} Oct 02 09:38:11 crc kubenswrapper[4722]: I1002 09:38:11.884822 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcbff" event={"ID":"1aba904b-b2dc-4eaf-b466-36c73f90de8a","Type":"ContainerStarted","Data":"fb06582e197bc7d1d17bbdf5ed419603f435cfabdc8adf54de6889d75e56eda8"} Oct 02 09:38:11 crc kubenswrapper[4722]: I1002 09:38:11.930855 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wcbff" podStartSLOduration=2.413231756 podStartE2EDuration="4.930826269s" podCreationTimestamp="2025-10-02 09:38:07 +0000 UTC" firstStartedPulling="2025-10-02 09:38:08.845867661 +0000 UTC m=+344.882620641" lastFinishedPulling="2025-10-02 09:38:11.363462174 +0000 UTC m=+347.400215154" observedRunningTime="2025-10-02 09:38:11.927109692 +0000 UTC m=+347.963862662" watchObservedRunningTime="2025-10-02 09:38:11.930826269 +0000 UTC m=+347.967579249" Oct 02 09:38:12 crc kubenswrapper[4722]: I1002 09:38:12.893747 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pc9sc" event={"ID":"f7eec357-442f-49c5-a539-067608f33f4f","Type":"ContainerStarted","Data":"5a46968f131c64ea5f996034b36adb4c53d9fd66bc94f9aed348877866ce1abd"} Oct 02 09:38:12 crc kubenswrapper[4722]: I1002 09:38:12.916878 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pc9sc" podStartSLOduration=2.386303522 podStartE2EDuration="4.916856823s" podCreationTimestamp="2025-10-02 09:38:08 +0000 UTC" firstStartedPulling="2025-10-02 09:38:09.856820669 +0000 UTC m=+345.893573649" lastFinishedPulling="2025-10-02 09:38:12.38737397 +0000 UTC m=+348.424126950" observedRunningTime="2025-10-02 09:38:12.91597322 +0000 UTC m=+348.952726220" watchObservedRunningTime="2025-10-02 09:38:12.916856823 +0000 UTC m=+348.953609803" Oct 02 09:38:15 crc kubenswrapper[4722]: I1002 09:38:15.573665 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:15 crc kubenswrapper[4722]: I1002 09:38:15.574217 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:15 crc kubenswrapper[4722]: I1002 09:38:15.619649 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:15 crc kubenswrapper[4722]: I1002 09:38:15.965705 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rfdss" Oct 02 09:38:16 crc kubenswrapper[4722]: I1002 09:38:16.204131 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:16 crc kubenswrapper[4722]: I1002 09:38:16.204772 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:16 crc kubenswrapper[4722]: I1002 09:38:16.247620 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:16 crc kubenswrapper[4722]: I1002 09:38:16.961574 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k6qks" Oct 02 09:38:17 crc kubenswrapper[4722]: I1002 09:38:17.975856 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wcbff" Oct 02 09:38:17 crc kubenswrapper[4722]: I1002 09:38:17.975940 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wcbff" Oct 02 09:38:18 crc kubenswrapper[4722]: I1002 09:38:18.021601 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wcbff" Oct 02 09:38:18 crc kubenswrapper[4722]: I1002 09:38:18.599341 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pc9sc" Oct 02 09:38:18 crc kubenswrapper[4722]: I1002 09:38:18.599418 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pc9sc" Oct 02 09:38:18 crc kubenswrapper[4722]: I1002 09:38:18.648004 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pc9sc" Oct 02 09:38:19 crc kubenswrapper[4722]: I1002 09:38:19.004706 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pc9sc" Oct 02 09:38:19 crc kubenswrapper[4722]: I1002 09:38:19.008494 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wcbff" Oct 02 09:38:40 crc kubenswrapper[4722]: I1002 09:38:40.696711 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:38:40 crc kubenswrapper[4722]: I1002 09:38:40.697442 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:39:10 crc kubenswrapper[4722]: I1002 09:39:10.697399 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:39:10 crc kubenswrapper[4722]: I1002 09:39:10.698032 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:39:33 crc kubenswrapper[4722]: I1002 09:39:33.912122 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bq79x"] Oct 02 09:39:33 crc kubenswrapper[4722]: I1002 09:39:33.913496 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:33 crc kubenswrapper[4722]: I1002 09:39:33.937805 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bq79x"] Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.022396 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.022469 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63431093-7442-4749-92c6-38c8e89b7eb4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.022506 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63431093-7442-4749-92c6-38c8e89b7eb4-registry-certificates\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.022533 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63431093-7442-4749-92c6-38c8e89b7eb4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.022568 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63431093-7442-4749-92c6-38c8e89b7eb4-trusted-ca\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.022612 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgpb\" (UniqueName: \"kubernetes.io/projected/63431093-7442-4749-92c6-38c8e89b7eb4-kube-api-access-csgpb\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.022639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/63431093-7442-4749-92c6-38c8e89b7eb4-registry-tls\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.022908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63431093-7442-4749-92c6-38c8e89b7eb4-bound-sa-token\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.054335 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.124049 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63431093-7442-4749-92c6-38c8e89b7eb4-registry-certificates\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.124112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63431093-7442-4749-92c6-38c8e89b7eb4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.124138 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63431093-7442-4749-92c6-38c8e89b7eb4-trusted-ca\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.124175 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csgpb\" (UniqueName: \"kubernetes.io/projected/63431093-7442-4749-92c6-38c8e89b7eb4-kube-api-access-csgpb\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.124197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63431093-7442-4749-92c6-38c8e89b7eb4-registry-tls\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.124245 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63431093-7442-4749-92c6-38c8e89b7eb4-bound-sa-token\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.124291 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63431093-7442-4749-92c6-38c8e89b7eb4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.124775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63431093-7442-4749-92c6-38c8e89b7eb4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.126338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63431093-7442-4749-92c6-38c8e89b7eb4-registry-certificates\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.126670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63431093-7442-4749-92c6-38c8e89b7eb4-trusted-ca\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.130822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63431093-7442-4749-92c6-38c8e89b7eb4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.130837 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63431093-7442-4749-92c6-38c8e89b7eb4-registry-tls\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.143635 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgpb\" (UniqueName: \"kubernetes.io/projected/63431093-7442-4749-92c6-38c8e89b7eb4-kube-api-access-csgpb\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.147040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63431093-7442-4749-92c6-38c8e89b7eb4-bound-sa-token\") pod \"image-registry-66df7c8f76-bq79x\" (UID: \"63431093-7442-4749-92c6-38c8e89b7eb4\") " pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.230923 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:34 crc kubenswrapper[4722]: I1002 09:39:34.420247 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bq79x"] Oct 02 09:39:35 crc kubenswrapper[4722]: I1002 09:39:35.442434 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" event={"ID":"63431093-7442-4749-92c6-38c8e89b7eb4","Type":"ContainerStarted","Data":"c494223c434508a1d9a79cabb762310e0bed0147e346ae38eed6329b42f3a2c4"} Oct 02 09:39:35 crc kubenswrapper[4722]: I1002 09:39:35.442770 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" event={"ID":"63431093-7442-4749-92c6-38c8e89b7eb4","Type":"ContainerStarted","Data":"9fc0032e0200f777883f4f78e0aab2857041752af68caec5efdfc1ca2f0029b6"} Oct 02 09:39:35 crc kubenswrapper[4722]: I1002 09:39:35.443875 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" Oct 02 09:39:35 crc kubenswrapper[4722]: I1002 09:39:35.466200 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-bq79x" podStartSLOduration=2.466180471 podStartE2EDuration="2.466180471s" podCreationTimestamp="2025-10-02 09:39:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:39:35.464370874 +0000 UTC m=+431.501123864" watchObservedRunningTime="2025-10-02 09:39:35.466180471 +0000 UTC m=+431.502933451" Oct 02 09:39:40 crc kubenswrapper[4722]: I1002 09:39:40.697282 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:39:40 crc kubenswrapper[4722]: I1002 09:39:40.697646 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:39:40 crc kubenswrapper[4722]: I1002 09:39:40.697706 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:39:40 crc kubenswrapper[4722]: I1002 09:39:40.698370 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"312c817917c93b621d9ce37d611efa243a39c268516486ea8c814510b384738e"} pod="openshift-machine-config-operator/machine-config-daemon-q6h76" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 09:39:40 crc kubenswrapper[4722]: I1002 09:39:40.698424 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" containerID="cri-o://312c817917c93b621d9ce37d611efa243a39c268516486ea8c814510b384738e" gracePeriod=600 Oct 02 09:39:41 crc kubenswrapper[4722]: I1002 
Oct 02 09:39:41 crc kubenswrapper[4722]: I1002 09:39:41.492208 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerDied","Data":"312c817917c93b621d9ce37d611efa243a39c268516486ea8c814510b384738e"}
Oct 02 09:39:41 crc kubenswrapper[4722]: I1002 09:39:41.492332 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerStarted","Data":"2ed22cde6265a465e12b7d2f595420982dbcb915a0dbd020faa1db13915d4fbc"}
Oct 02 09:39:41 crc kubenswrapper[4722]: I1002 09:39:41.492420 4722 scope.go:117] "RemoveContainer" containerID="cfc66c4529a0875a210fbc2b5af96db36ab208d651068958075de46898c1e150"
Oct 02 09:39:54 crc kubenswrapper[4722]: I1002 09:39:54.237934 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-bq79x"
Oct 02 09:39:55 crc kubenswrapper[4722]: I1002 09:39:54.302173 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kf5f5"]
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.347944 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" podUID="160f98c9-f6b0-4d80-ab97-37487b03a785" containerName="registry" containerID="cri-o://11078d8f98e664e7e2db2a9ac1dcbf641e653b5db9e80e09c6980e12d7a41b27" gracePeriod=30
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.703047 4722 generic.go:334] "Generic (PLEG): container finished" podID="160f98c9-f6b0-4d80-ab97-37487b03a785" containerID="11078d8f98e664e7e2db2a9ac1dcbf641e653b5db9e80e09c6980e12d7a41b27" exitCode=0
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.703110 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" event={"ID":"160f98c9-f6b0-4d80-ab97-37487b03a785","Type":"ContainerDied","Data":"11078d8f98e664e7e2db2a9ac1dcbf641e653b5db9e80e09c6980e12d7a41b27"}
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.703148 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5" event={"ID":"160f98c9-f6b0-4d80-ab97-37487b03a785","Type":"ContainerDied","Data":"44e20f388432c651ccc007c3ad61ab6bb7dea7e0ce6a6a4c36ea5fb7b5a0d365"}
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.703163 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44e20f388432c651ccc007c3ad61ab6bb7dea7e0ce6a6a4c36ea5fb7b5a0d365"
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.749610 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.818797 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-trusted-ca\") pod \"160f98c9-f6b0-4d80-ab97-37487b03a785\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") "
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.819374 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-certificates\") pod \"160f98c9-f6b0-4d80-ab97-37487b03a785\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") "
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.819874 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "160f98c9-f6b0-4d80-ab97-37487b03a785" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.820456 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "160f98c9-f6b0-4d80-ab97-37487b03a785" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.820487 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/160f98c9-f6b0-4d80-ab97-37487b03a785-installation-pull-secrets\") pod \"160f98c9-f6b0-4d80-ab97-37487b03a785\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") "
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.820561 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/160f98c9-f6b0-4d80-ab97-37487b03a785-ca-trust-extracted\") pod \"160f98c9-f6b0-4d80-ab97-37487b03a785\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") "
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.820607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw9x6\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-kube-api-access-lw9x6\") pod \"160f98c9-f6b0-4d80-ab97-37487b03a785\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") "
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.820685 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-tls\") pod \"160f98c9-f6b0-4d80-ab97-37487b03a785\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") "
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.820899 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"160f98c9-f6b0-4d80-ab97-37487b03a785\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") "
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.820928 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-bound-sa-token\") pod \"160f98c9-f6b0-4d80-ab97-37487b03a785\" (UID: \"160f98c9-f6b0-4d80-ab97-37487b03a785\") "
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.821397 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.821419 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-certificates\") on node \"crc\" DevicePath \"\""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.826653 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-kube-api-access-lw9x6" (OuterVolumeSpecName: "kube-api-access-lw9x6") pod "160f98c9-f6b0-4d80-ab97-37487b03a785" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785"). InnerVolumeSpecName "kube-api-access-lw9x6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.826958 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/160f98c9-f6b0-4d80-ab97-37487b03a785-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "160f98c9-f6b0-4d80-ab97-37487b03a785" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.827030 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "160f98c9-f6b0-4d80-ab97-37487b03a785" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.827215 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "160f98c9-f6b0-4d80-ab97-37487b03a785" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.836294 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/160f98c9-f6b0-4d80-ab97-37487b03a785-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "160f98c9-f6b0-4d80-ab97-37487b03a785" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.839592 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "160f98c9-f6b0-4d80-ab97-37487b03a785" (UID: "160f98c9-f6b0-4d80-ab97-37487b03a785"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.923105 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw9x6\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-kube-api-access-lw9x6\") on node \"crc\" DevicePath \"\""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.923356 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.923456 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/160f98c9-f6b0-4d80-ab97-37487b03a785-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.923513 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/160f98c9-f6b0-4d80-ab97-37487b03a785-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Oct 02 09:40:19 crc kubenswrapper[4722]: I1002 09:40:19.923559 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/160f98c9-f6b0-4d80-ab97-37487b03a785-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Oct 02 09:40:20 crc kubenswrapper[4722]: I1002 09:40:20.708973 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kf5f5"
Oct 02 09:40:20 crc kubenswrapper[4722]: I1002 09:40:20.746829 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kf5f5"]
Oct 02 09:40:20 crc kubenswrapper[4722]: I1002 09:40:20.749842 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kf5f5"]
Oct 02 09:40:22 crc kubenswrapper[4722]: I1002 09:40:22.718247 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="160f98c9-f6b0-4d80-ab97-37487b03a785" path="/var/lib/kubelet/pods/160f98c9-f6b0-4d80-ab97-37487b03a785/volumes"
Oct 02 09:41:24 crc kubenswrapper[4722]: I1002 09:41:24.841478 4722 scope.go:117] "RemoveContainer" containerID="11078d8f98e664e7e2db2a9ac1dcbf641e653b5db9e80e09c6980e12d7a41b27"
Oct 02 09:42:10 crc kubenswrapper[4722]: I1002 09:42:10.696610 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 09:42:10 crc kubenswrapper[4722]: I1002 09:42:10.697206 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 09:42:40 crc kubenswrapper[4722]: I1002 09:42:40.696615 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 09:42:40 crc kubenswrapper[4722]: I1002 09:42:40.697133 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 09:43:10 crc kubenswrapper[4722]: I1002 09:43:10.696981 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 09:43:10 crc kubenswrapper[4722]: I1002 09:43:10.697370 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 09:43:10 crc kubenswrapper[4722]: I1002 09:43:10.697408 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q6h76"
Oct 02 09:43:10 crc kubenswrapper[4722]: I1002 09:43:10.697927 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ed22cde6265a465e12b7d2f595420982dbcb915a0dbd020faa1db13915d4fbc"} pod="openshift-machine-config-operator/machine-config-daemon-q6h76" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 09:43:10 crc kubenswrapper[4722]: I1002 09:43:10.697971 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" containerID="cri-o://2ed22cde6265a465e12b7d2f595420982dbcb915a0dbd020faa1db13915d4fbc" gracePeriod=600
Oct 02 09:43:11 crc kubenswrapper[4722]: I1002 09:43:11.607215 4722 generic.go:334] "Generic (PLEG): container finished" podID="f662826c-e772-4f56-a85f-83971aaef356" containerID="2ed22cde6265a465e12b7d2f595420982dbcb915a0dbd020faa1db13915d4fbc" exitCode=0
Oct 02 09:43:11 crc kubenswrapper[4722]: I1002 09:43:11.607284 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerDied","Data":"2ed22cde6265a465e12b7d2f595420982dbcb915a0dbd020faa1db13915d4fbc"}
Oct 02 09:43:11 crc kubenswrapper[4722]: I1002 09:43:11.607751 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerStarted","Data":"2e23a92796174868d03a49e3fa647f65a4cb15be5ec415c3eed20693726accb6"}
Oct 02 09:43:11 crc kubenswrapper[4722]: I1002 09:43:11.607773 4722 scope.go:117] "RemoveContainer" containerID="312c817917c93b621d9ce37d611efa243a39c268516486ea8c814510b384738e"
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.589272 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jlvpl"]
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.591031 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovn-controller" containerID="cri-o://685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a" gracePeriod=30
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.591058 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="sbdb" containerID="cri-o://3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28" gracePeriod=30
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.591181 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="nbdb" containerID="cri-o://04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c" gracePeriod=30
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.591233 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="northd" containerID="cri-o://f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453" gracePeriod=30
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.591284 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovn-acl-logging" containerID="cri-o://e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4" gracePeriod=30
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.591351 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="kube-rbac-proxy-node" containerID="cri-o://9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389" gracePeriod=30
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.591398 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a" gracePeriod=30
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.685348 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller" containerID="cri-o://408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204" gracePeriod=30
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.939973 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/3.log"
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.941984 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovn-acl-logging/0.log"
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.942552 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovn-controller/0.log"
Oct 02 09:44:42 crc kubenswrapper[4722]: I1002 09:44:42.943035 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.000098 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xnfpf"]
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.000773 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovn-acl-logging"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.000873 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovn-acl-logging"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.000961 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="northd"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.001032 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="northd"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.001107 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="nbdb"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.001176 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="nbdb"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.001250 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovn-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.001355 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovn-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.001451 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="sbdb"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.001527 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="sbdb"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.001602 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.001668 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.001744 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.001815 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.001910 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="kube-rbac-proxy-ovn-metrics"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.001985 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="kube-rbac-proxy-ovn-metrics"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.002057 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.002128 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.002202 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.002272 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.002366 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.002454 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.002561 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160f98c9-f6b0-4d80-ab97-37487b03a785" containerName="registry"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.002628 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="160f98c9-f6b0-4d80-ab97-37487b03a785" containerName="registry"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.002702 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="kube-rbac-proxy-node"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.002768 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="kube-rbac-proxy-node"
Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.002846 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="kubecfg-setup"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.002933 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="kubecfg-setup"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003136 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003221 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovn-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003293 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="nbdb"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003407 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="northd"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003488 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="160f98c9-f6b0-4d80-ab97-37487b03a785" containerName="registry"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003561 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="kube-rbac-proxy-node"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003628 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="sbdb"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003699 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003767 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003835 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovn-acl-logging"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.003902 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="kube-rbac-proxy-ovn-metrics"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.004245 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.004546 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerName="ovnkube-controller"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.006974 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.081681 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovnkube-controller/3.log"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.084414 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovn-acl-logging/0.log"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.085400 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jlvpl_a900403a-9c51-4ca5-862f-2fbbd30cbcc3/ovn-controller/0.log"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.085915 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204" exitCode=0
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086026 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28" exitCode=0
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086085 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c" exitCode=0
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086143 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453" exitCode=0
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086196 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a" exitCode=0
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086255 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389" exitCode=0
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086308 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4" exitCode=143
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086414 4722 generic.go:334] "Generic (PLEG): container finished" podID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" containerID="685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a" exitCode=143
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.085992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204"}
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086062 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl"
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086565 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28"}
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086612 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c"}
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453"}
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086696 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a"}
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086710 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389"}
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086726 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90"}
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086741 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28"}
Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086749 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086755 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086763 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086769 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086672 4722 scope.go:117] "RemoveContainer" containerID="408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086775 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086880 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086888 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086898 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.086909 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087293 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087324 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087333 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087339 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087345 4722 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087350 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087356 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087363 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087368 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087391 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087398 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087404 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087410 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087416 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087421 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087427 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087435 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087441 4722 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087447 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087456 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jlvpl" event={"ID":"a900403a-9c51-4ca5-862f-2fbbd30cbcc3","Type":"ContainerDied","Data":"852db608e5f10cf96b37d5670b47f7763fbe2e293ec97cd6faf3a5d2bc626dfd"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087465 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087472 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087479 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087485 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087491 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087496 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087502 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087508 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087514 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.087520 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.093139 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/2.log" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.094124 4722 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/1.log" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.094192 4722 generic.go:334] "Generic (PLEG): container finished" podID="04c84986-75a1-4683-8764-8f7307d1126a" containerID="3b7bf9f2844675ba4d9daf32117df4cf7541f82b5f6c4e42c2b03ddfb891ecda" exitCode=2 Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.094230 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zwlns" event={"ID":"04c84986-75a1-4683-8764-8f7307d1126a","Type":"ContainerDied","Data":"3b7bf9f2844675ba4d9daf32117df4cf7541f82b5f6c4e42c2b03ddfb891ecda"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.094254 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a"} Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.095181 4722 scope.go:117] "RemoveContainer" containerID="3b7bf9f2844675ba4d9daf32117df4cf7541f82b5f6c4e42c2b03ddfb891ecda" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112282 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-script-lib\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112361 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112397 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-netd\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112441 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-ovn-kubernetes\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112507 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-env-overrides\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112531 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-ovn\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112550 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-slash\") pod 
\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-systemd\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112597 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-etc-openvswitch\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112616 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-openvswitch\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112635 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovn-node-metrics-cert\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112682 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-log-socket\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-netns\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112733 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-systemd-units\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112760 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-kubelet\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112785 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-bin\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112810 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pfvc\" (UniqueName: \"kubernetes.io/projected/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-kube-api-access-4pfvc\") pod 
\"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112825 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-node-log\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112845 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-var-lib-openvswitch\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.112867 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-config\") pod \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\" (UID: \"a900403a-9c51-4ca5-862f-2fbbd30cbcc3\") " Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113114 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-slash\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113149 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/834dc7e3-cb17-4efd-9d54-dca44ccf2188-ovnkube-config\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113175 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-run-ovn\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113194 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2prv6\" (UniqueName: \"kubernetes.io/projected/834dc7e3-cb17-4efd-9d54-dca44ccf2188-kube-api-access-2prv6\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113214 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-cni-bin\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113233 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-node-log\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 
09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113253 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-systemd-units\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113489 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-var-lib-openvswitch\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113511 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-cni-netd\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-run-netns\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113555 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-etc-openvswitch\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-run-systemd\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-run-openvswitch\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113652 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/834dc7e3-cb17-4efd-9d54-dca44ccf2188-env-overrides\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113689 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113711 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-run-ovn-kubernetes\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113748 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-log-socket\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113769 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-kubelet\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113791 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/834dc7e3-cb17-4efd-9d54-dca44ccf2188-ovn-node-metrics-cert\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/834dc7e3-cb17-4efd-9d54-dca44ccf2188-ovnkube-script-lib\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.113974 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114008 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114032 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114054 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114057 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114673 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114743 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-slash" (OuterVolumeSpecName: "host-slash") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114771 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114833 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-log-socket" (OuterVolumeSpecName: "log-socket") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114859 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-node-log" (OuterVolumeSpecName: "node-log") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114901 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.114944 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.115303 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.115679 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.115707 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.118513 4722 scope.go:117] "RemoveContainer" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.122450 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-kube-api-access-4pfvc" (OuterVolumeSpecName: "kube-api-access-4pfvc") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "kube-api-access-4pfvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.124115 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.130782 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a900403a-9c51-4ca5-862f-2fbbd30cbcc3" (UID: "a900403a-9c51-4ca5-862f-2fbbd30cbcc3"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.135682 4722 scope.go:117] "RemoveContainer" containerID="3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.150219 4722 scope.go:117] "RemoveContainer" containerID="04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.172751 4722 scope.go:117] "RemoveContainer" containerID="f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.194680 4722 scope.go:117] "RemoveContainer" containerID="6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.207556 4722 scope.go:117] "RemoveContainer" containerID="9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215189 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-log-socket\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215266 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-kubelet\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215309 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/834dc7e3-cb17-4efd-9d54-dca44ccf2188-ovn-node-metrics-cert\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/834dc7e3-cb17-4efd-9d54-dca44ccf2188-ovnkube-script-lib\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-log-socket\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215604 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-kubelet\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215628 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-slash\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-slash\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/834dc7e3-cb17-4efd-9d54-dca44ccf2188-ovnkube-config\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215885 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-run-ovn\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2prv6\" (UniqueName: \"kubernetes.io/projected/834dc7e3-cb17-4efd-9d54-dca44ccf2188-kube-api-access-2prv6\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-cni-bin\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215947 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-node-log\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215968 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-systemd-units\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.215993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-var-lib-openvswitch\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-cni-netd\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216046 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-run-netns\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-etc-openvswitch\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216104 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-run-systemd\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-run-openvswitch\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216181 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/834dc7e3-cb17-4efd-9d54-dca44ccf2188-env-overrides\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216214 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-run-ovn-kubernetes\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216335 4722 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216346 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pfvc\" (UniqueName: \"kubernetes.io/projected/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-kube-api-access-4pfvc\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216358 4722 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216366 4722 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-node-log\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216375 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216384 4722 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216393 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216402 4722 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216412 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216421 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216429 4722 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216438 4722 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-slash\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216446 4722 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-systemd\") 
on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216454 4722 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216462 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216470 4722 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216480 4722 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-log-socket\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216488 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216498 4722 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.216506 4722 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a900403a-9c51-4ca5-862f-2fbbd30cbcc3-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-run-netns\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217126 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-run-ovn\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217207 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/834dc7e3-cb17-4efd-9d54-dca44ccf2188-ovnkube-script-lib\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-systemd-units\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217359 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-cni-bin\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217392 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-etc-openvswitch\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-node-log\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217436 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-var-lib-openvswitch\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217445 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-run-systemd\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217499 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217506 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-cni-netd\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-host-run-ovn-kubernetes\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217528 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/834dc7e3-cb17-4efd-9d54-dca44ccf2188-ovnkube-config\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/834dc7e3-cb17-4efd-9d54-dca44ccf2188-run-openvswitch\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.217840 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/834dc7e3-cb17-4efd-9d54-dca44ccf2188-env-overrides\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.222129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/834dc7e3-cb17-4efd-9d54-dca44ccf2188-ovn-node-metrics-cert\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.233254 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2prv6\" (UniqueName: \"kubernetes.io/projected/834dc7e3-cb17-4efd-9d54-dca44ccf2188-kube-api-access-2prv6\") pod \"ovnkube-node-xnfpf\" (UID: \"834dc7e3-cb17-4efd-9d54-dca44ccf2188\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.236607 4722 scope.go:117] "RemoveContainer" containerID="e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.253714 4722 scope.go:117] "RemoveContainer" containerID="685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.269266 4722 scope.go:117] "RemoveContainer" containerID="7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.283083 4722 scope.go:117] "RemoveContainer" containerID="408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204" Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.283794 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": container with ID starting with 408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204 not found: ID does not exist" containerID="408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.283842 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204"} err="failed to get container status \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": rpc error: code = NotFound desc = could not find container \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": container with ID starting with 408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.283873 4722 scope.go:117] "RemoveContainer" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.284226 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\": container with ID starting with 523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90 not found: ID does not exist" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.284257 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90"} err="failed to get container status \"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\": rpc error: code = NotFound desc = could not find container \"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\": container with ID starting with 523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.284270 4722 scope.go:117] "RemoveContainer" containerID="3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28" Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.284533 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\": container with ID starting with 3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28 not found: ID does not exist" containerID="3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.284558 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28"} err="failed to get container status \"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\": rpc error: code = NotFound desc = could not find container \"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\": container with ID starting with 3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.284571 4722 scope.go:117] "RemoveContainer" containerID="04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c" Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.284751 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\": container with ID starting with 04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c not found: ID does not exist" containerID="04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.284774 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c"} err="failed to get container status \"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\": rpc error: code = NotFound desc = could not find container \"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\": container with ID starting with 04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.284786 4722 scope.go:117] "RemoveContainer" containerID="f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453" Oct 02 09:44:43 crc 
kubenswrapper[4722]: E1002 09:44:43.285020 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\": container with ID starting with f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453 not found: ID does not exist" containerID="f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.285045 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453"} err="failed to get container status \"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\": rpc error: code = NotFound desc = could not find container \"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\": container with ID starting with f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.285065 4722 scope.go:117] "RemoveContainer" containerID="6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a" Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.285522 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\": container with ID starting with 6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a not found: ID does not exist" containerID="6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.285549 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a"} err="failed to get container status \"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\": rpc error: code = NotFound desc = could not find container \"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\": container with ID starting with 6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.285562 4722 scope.go:117] "RemoveContainer" containerID="9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389" Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.285846 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\": container with ID starting with 9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389 not found: ID does not exist" containerID="9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.285868 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389"} err="failed to get container status \"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\": rpc error: code = NotFound desc = could not find container \"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\": container with ID starting with 9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: 
I1002 09:44:43.285881 4722 scope.go:117] "RemoveContainer" containerID="e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4" Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.286285 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\": container with ID starting with e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4 not found: ID does not exist" containerID="e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.286352 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4"} err="failed to get container status \"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\": rpc error: code = NotFound desc = could not find container \"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\": container with ID starting with e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.286434 4722 scope.go:117] "RemoveContainer" containerID="685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a" Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.286915 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\": container with ID starting with 685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a not found: ID does not exist" containerID="685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.286979 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a"} err="failed to get container status \"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\": rpc error: code = NotFound desc = could not find container \"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\": container with ID starting with 685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.287020 4722 scope.go:117] "RemoveContainer" containerID="7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346" Oct 02 09:44:43 crc kubenswrapper[4722]: E1002 09:44:43.287373 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\": container with ID starting with 7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346 not found: ID does not exist" containerID="7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.287445 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346"} err="failed to get container status \"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\": rpc error: code = NotFound desc = could not find container \"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\": container 
with ID starting with 7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.287461 4722 scope.go:117] "RemoveContainer" containerID="408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.287734 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204"} err="failed to get container status \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": rpc error: code = NotFound desc = could not find container \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": container with ID starting with 408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.287756 4722 scope.go:117] "RemoveContainer" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.288066 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90"} err="failed to get container status \"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\": rpc error: code = NotFound desc = could not find container \"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\": container with ID starting with 523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.288094 4722 scope.go:117] "RemoveContainer" containerID="3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.288294 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28"} err="failed to get container status \"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\": rpc error: code = NotFound desc = could not find container \"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\": container with ID starting with 3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.288351 4722 scope.go:117] "RemoveContainer" containerID="04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.288715 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c"} err="failed to get container status \"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\": rpc error: code = NotFound desc = could not find container \"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\": container with ID starting with 04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.288741 4722 scope.go:117] "RemoveContainer" containerID="f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.289031 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453"} err="failed to get container status \"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\": rpc error: code = NotFound desc = could not find container \"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\": container with ID starting with f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.289072 4722 scope.go:117] "RemoveContainer" containerID="6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.289365 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a"} err="failed to get container status \"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\": rpc error: code = NotFound desc = could not find container \"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\": container with ID starting with 6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.289405 4722 scope.go:117] "RemoveContainer" containerID="9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.289666 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389"} err="failed to get container status \"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\": rpc error: code = NotFound desc = could not find container \"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\": container with ID starting with 9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.289694 4722 scope.go:117] "RemoveContainer" containerID="e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.289955 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4"} err="failed to get container status \"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\": rpc error: code = NotFound desc = could not find container \"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\": container with ID starting with e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.290000 4722 scope.go:117] "RemoveContainer" containerID="685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.290247 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a"} err="failed to get container status \"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\": rpc error: code = NotFound desc = could not find container \"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\": container with ID starting with 685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a not found: ID does not exist" Oct 
02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.290271 4722 scope.go:117] "RemoveContainer" containerID="7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.290509 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346"} err="failed to get container status \"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\": rpc error: code = NotFound desc = could not find container \"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\": container with ID starting with 7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.290534 4722 scope.go:117] "RemoveContainer" containerID="408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.290833 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204"} err="failed to get container status \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": rpc error: code = NotFound desc = could not find container \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": container with ID starting with 408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.290848 4722 scope.go:117] "RemoveContainer" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.291098 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90"} err="failed to get container status \"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\": rpc error: code = NotFound desc = could not find container \"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\": container with ID starting with 523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.291122 4722 scope.go:117] "RemoveContainer" containerID="3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.291439 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28"} err="failed to get container status \"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\": rpc error: code = NotFound desc = could not find container \"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\": container with ID starting with 3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.291459 4722 scope.go:117] "RemoveContainer" containerID="04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.291760 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c"} err="failed to get container status 
\"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\": rpc error: code = NotFound desc = could not find container \"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\": container with ID starting with 04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.291798 4722 scope.go:117] "RemoveContainer" containerID="f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.292088 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453"} err="failed to get container status \"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\": rpc error: code = NotFound desc = could not find container \"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\": container with ID starting with f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.292110 4722 scope.go:117] "RemoveContainer" containerID="6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.292368 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a"} err="failed to get container status \"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\": rpc error: code = NotFound desc = could not find container \"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\": container with ID starting with 6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.292395 4722 scope.go:117] "RemoveContainer" containerID="9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.292734 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389"} err="failed to get container status \"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\": rpc error: code = NotFound desc = could not find container \"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\": container with ID starting with 9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.292759 4722 scope.go:117] "RemoveContainer" containerID="e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.293045 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4"} err="failed to get container status \"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\": rpc error: code = NotFound desc = could not find container \"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\": container with ID starting with e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.293091 4722 scope.go:117] "RemoveContainer" 
containerID="685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.293351 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a"} err="failed to get container status \"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\": rpc error: code = NotFound desc = could not find container \"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\": container with ID starting with 685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.293376 4722 scope.go:117] "RemoveContainer" containerID="7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.293605 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346"} err="failed to get container status \"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\": rpc error: code = NotFound desc = could not find container \"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\": container with ID starting with 7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.293641 4722 scope.go:117] "RemoveContainer" containerID="408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.293927 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204"} err="failed to get container status \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": rpc error: code = NotFound desc = could not find container \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": container with ID starting with 408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.293956 4722 scope.go:117] "RemoveContainer" containerID="523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.294191 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90"} err="failed to get container status \"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\": rpc error: code = NotFound desc = could not find container \"523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90\": container with ID starting with 523bd1dd360fb1d2b7a23339cf39f643b7dbd12a237ec876e659b1a174797d90 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.294211 4722 scope.go:117] "RemoveContainer" containerID="3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.294425 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28"} err="failed to get container status \"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\": rpc error: code = NotFound desc = could not find 
container \"3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28\": container with ID starting with 3acb9c824ed1c2edc3093107b02ff9e56f4b9d4bc59c730f6d698934bc1b8c28 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.294447 4722 scope.go:117] "RemoveContainer" containerID="04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.294732 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c"} err="failed to get container status \"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\": rpc error: code = NotFound desc = could not find container \"04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c\": container with ID starting with 04526f44dc9d5fde75c6cc4faf35d3c6aea06d38f7bb3ec9ca879938aa52e92c not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.294754 4722 scope.go:117] "RemoveContainer" containerID="f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.295060 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453"} err="failed to get container status \"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\": rpc error: code = NotFound desc = could not find container \"f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453\": container with ID starting with f6fd66355f7ab0e2234ef393a1adf928f3690e6f6832deec7e866c185f0f3453 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.295079 4722 scope.go:117] "RemoveContainer" containerID="6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.295289 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a"} err="failed to get container status \"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\": rpc error: code = NotFound desc = could not find container \"6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a\": container with ID starting with 6302533475f934b956a93466de2dc54e05045a9079eb379a21dcb37b41e71d0a not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.295306 4722 scope.go:117] "RemoveContainer" containerID="9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.295532 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389"} err="failed to get container status \"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\": rpc error: code = NotFound desc = could not find container \"9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389\": container with ID starting with 9d91262b0af3a4e93003ca32e6c438b1bfbc17a00bbebd40404b13c092a35389 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.295553 4722 scope.go:117] "RemoveContainer" containerID="e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.295821 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4"} err="failed to get container status \"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\": rpc error: code = NotFound desc = could not find container \"e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4\": container with ID starting with e6f831a7370fa5252abbd8696457d67a6ce9ea2d5f144c672e8828eeeb1c47e4 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.295844 4722 scope.go:117] "RemoveContainer" containerID="685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.296210 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a"} err="failed to get container status \"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\": rpc error: code = NotFound desc = could not find container \"685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a\": container with ID starting with 685be3b245436ea8ef63026d72046eaa9d3a8091615f88dd6038d19e79f1f51a not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.296238 4722 scope.go:117] "RemoveContainer" containerID="7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.296631 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346"} err="failed to get container status \"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\": rpc error: code = NotFound desc = could not find container \"7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346\": container with ID starting with 7cdbaee61e467074f384962c0a1a0dc00b2eb96122e1531b8d48eeea0fc01346 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.296675 4722 scope.go:117] "RemoveContainer" containerID="408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.296937 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204"} err="failed to get container status \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": rpc error: code = NotFound desc = could not find container \"408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204\": container with ID starting with 408cf3f42fa6f355e1374bebc394c4c99071375d6de36e14e374f7008790f204 not found: ID does not exist" Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.324490 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:43 crc kubenswrapper[4722]: W1002 09:44:43.345473 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod834dc7e3_cb17_4efd_9d54_dca44ccf2188.slice/crio-e585b5cd17133775e2abef99ebaf0aa173b74052f914868384259ad6eae098e8 WatchSource:0}: Error finding container e585b5cd17133775e2abef99ebaf0aa173b74052f914868384259ad6eae098e8: Status 404 returned error can't find the container with id e585b5cd17133775e2abef99ebaf0aa173b74052f914868384259ad6eae098e8 Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.430198 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jlvpl"] Oct 02 09:44:43 crc kubenswrapper[4722]: I1002 09:44:43.432843 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jlvpl"] Oct 02 09:44:44 crc kubenswrapper[4722]: I1002 09:44:44.103819 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/2.log" Oct 02 09:44:44 crc kubenswrapper[4722]: I1002 09:44:44.104701 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/1.log" Oct 02 09:44:44 crc kubenswrapper[4722]: I1002 09:44:44.104901 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zwlns" event={"ID":"04c84986-75a1-4683-8764-8f7307d1126a","Type":"ContainerStarted","Data":"57d4eac19b50b62f339967516440b2ac11d5f84e55ac2ab012ce7c95b8cef1c4"} Oct 02 09:44:44 crc kubenswrapper[4722]: I1002 09:44:44.106913 4722 generic.go:334] "Generic (PLEG): container finished" podID="834dc7e3-cb17-4efd-9d54-dca44ccf2188" containerID="207052d921d4bb70f371d14b48e90a5c908e455bc3a818e9d4a3e7ffc53fb9e0" exitCode=0 Oct 02 09:44:44 crc kubenswrapper[4722]: I1002 09:44:44.106962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" event={"ID":"834dc7e3-cb17-4efd-9d54-dca44ccf2188","Type":"ContainerDied","Data":"207052d921d4bb70f371d14b48e90a5c908e455bc3a818e9d4a3e7ffc53fb9e0"} Oct 02 09:44:44 crc kubenswrapper[4722]: I1002 09:44:44.107054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" event={"ID":"834dc7e3-cb17-4efd-9d54-dca44ccf2188","Type":"ContainerStarted","Data":"e585b5cd17133775e2abef99ebaf0aa173b74052f914868384259ad6eae098e8"} Oct 02 09:44:44 crc kubenswrapper[4722]: I1002 09:44:44.718060 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a900403a-9c51-4ca5-862f-2fbbd30cbcc3" path="/var/lib/kubelet/pods/a900403a-9c51-4ca5-862f-2fbbd30cbcc3/volumes" Oct 02 09:44:45 crc kubenswrapper[4722]: I1002 09:44:45.116964 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" event={"ID":"834dc7e3-cb17-4efd-9d54-dca44ccf2188","Type":"ContainerStarted","Data":"cd535ba56eba9306056d505ee7ad958e1db53c65996f72292554f54239309521"} Oct 02 09:44:45 crc kubenswrapper[4722]: I1002 09:44:45.117422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" event={"ID":"834dc7e3-cb17-4efd-9d54-dca44ccf2188","Type":"ContainerStarted","Data":"3a2bbaf3185c4fcbb9f40a993e8cfe9e5bf23effd8063ee6aa06bbdcba489581"} Oct 02 09:44:45 crc kubenswrapper[4722]: I1002 09:44:45.117437 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" event={"ID":"834dc7e3-cb17-4efd-9d54-dca44ccf2188","Type":"ContainerStarted","Data":"83150c167cc6966fc5ee60717085e9a157e6fecb48207c60ed985bf06d6a36fd"} Oct 02 09:44:45 crc kubenswrapper[4722]: I1002 09:44:45.117447 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" event={"ID":"834dc7e3-cb17-4efd-9d54-dca44ccf2188","Type":"ContainerStarted","Data":"6fad5025f1d30336dd0f9631ce40bfeab6ce91fcb69bcc2085abaf91b4b5d80a"} Oct 02 09:44:45 crc kubenswrapper[4722]: I1002 09:44:45.117456 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" event={"ID":"834dc7e3-cb17-4efd-9d54-dca44ccf2188","Type":"ContainerStarted","Data":"4688e72e17b3dc8ce544b10d65c3dfa26635dfe88123eea63b10b5b0c63e7a3b"} Oct 02 09:44:45 crc kubenswrapper[4722]: I1002 09:44:45.117466 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" event={"ID":"834dc7e3-cb17-4efd-9d54-dca44ccf2188","Type":"ContainerStarted","Data":"6e24decd5058cbb42e5917cfc66c023f6ccc4dbffe440c8c708f3c31de629b7d"} Oct 02 09:44:47 crc kubenswrapper[4722]: I1002 09:44:47.133683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" event={"ID":"834dc7e3-cb17-4efd-9d54-dca44ccf2188","Type":"ContainerStarted","Data":"11d3999975655451412ee29a587a6cc0a154d5565aa6fba2403288e8d3ca54b3"} Oct 02 09:44:51 crc kubenswrapper[4722]: I1002 09:44:51.157020 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" event={"ID":"834dc7e3-cb17-4efd-9d54-dca44ccf2188","Type":"ContainerStarted","Data":"e7496bf2dfe0ac826720e58dfca8cc30c452c8b2ef4114bcb9118434039b40ae"} Oct 02 09:44:51 crc kubenswrapper[4722]: I1002 09:44:51.158001 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:51 crc kubenswrapper[4722]: I1002 09:44:51.158017 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:51 crc kubenswrapper[4722]: I1002 09:44:51.158026 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:51 crc kubenswrapper[4722]: I1002 09:44:51.187982 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:51 crc kubenswrapper[4722]: I1002 09:44:51.189447 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:44:51 crc kubenswrapper[4722]: I1002 09:44:51.195922 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" podStartSLOduration=9.195896377 podStartE2EDuration="9.195896377s" podCreationTimestamp="2025-10-02 09:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:44:51.194568874 +0000 UTC m=+747.231321864" watchObservedRunningTime="2025-10-02 09:44:51.195896377 +0000 UTC m=+747.232649357" Oct 02 09:44:55 crc kubenswrapper[4722]: I1002 09:44:55.720438 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk6t7"] Oct 02 09:44:55 crc 
kubenswrapper[4722]: I1002 09:44:55.721391 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" podUID="e6070617-4dba-40fb-a6a8-03f6807a6846" containerName="controller-manager" containerID="cri-o://b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b" gracePeriod=30 Oct 02 09:44:55 crc kubenswrapper[4722]: I1002 09:44:55.822939 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"] Oct 02 09:44:55 crc kubenswrapper[4722]: I1002 09:44:55.823495 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" podUID="4104d0d5-52a2-47da-9cd0-86d4c52bc122" containerName="route-controller-manager" containerID="cri-o://f0d0a6890e32e6337bf39f4e43cfa0b6f045e101f7a3f7698da8337461d2c197" gracePeriod=30 Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.152890 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.188515 4722 generic.go:334] "Generic (PLEG): container finished" podID="4104d0d5-52a2-47da-9cd0-86d4c52bc122" containerID="f0d0a6890e32e6337bf39f4e43cfa0b6f045e101f7a3f7698da8337461d2c197" exitCode=0 Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.188604 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" event={"ID":"4104d0d5-52a2-47da-9cd0-86d4c52bc122","Type":"ContainerDied","Data":"f0d0a6890e32e6337bf39f4e43cfa0b6f045e101f7a3f7698da8337461d2c197"} Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.190545 4722 generic.go:334] "Generic (PLEG): container finished" podID="e6070617-4dba-40fb-a6a8-03f6807a6846" containerID="b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b" exitCode=0 Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.190585 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" event={"ID":"e6070617-4dba-40fb-a6a8-03f6807a6846","Type":"ContainerDied","Data":"b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b"} Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.190594 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.190607 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zk6t7" event={"ID":"e6070617-4dba-40fb-a6a8-03f6807a6846","Type":"ContainerDied","Data":"e1041cd64d5292462736a179ea4c83f1274dba1122330f23dba78842236b2f70"} Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.190628 4722 scope.go:117] "RemoveContainer" containerID="b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.212497 4722 scope.go:117] "RemoveContainer" containerID="b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b" Oct 02 09:44:56 crc kubenswrapper[4722]: E1002 09:44:56.212883 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b\": container with ID starting with b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b not found: ID does not exist" containerID="b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.212921 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b"} err="failed to get container status \"b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b\": rpc error: code = NotFound desc = could not find container \"b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b\": container with ID starting with b0fb8662bd4f1081d8140dd72cb713be6e97e74a296372d38b1677e1d3793b7b not found: ID does not exist" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.284546 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-config\") pod \"e6070617-4dba-40fb-a6a8-03f6807a6846\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.284878 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-client-ca\") pod \"e6070617-4dba-40fb-a6a8-03f6807a6846\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.284951 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-proxy-ca-bundles\") pod \"e6070617-4dba-40fb-a6a8-03f6807a6846\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.284994 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6070617-4dba-40fb-a6a8-03f6807a6846-serving-cert\") pod \"e6070617-4dba-40fb-a6a8-03f6807a6846\" (UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.285045 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2gv4\" (UniqueName: \"kubernetes.io/projected/e6070617-4dba-40fb-a6a8-03f6807a6846-kube-api-access-g2gv4\") pod \"e6070617-4dba-40fb-a6a8-03f6807a6846\" 
(UID: \"e6070617-4dba-40fb-a6a8-03f6807a6846\") " Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.285660 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-client-ca" (OuterVolumeSpecName: "client-ca") pod "e6070617-4dba-40fb-a6a8-03f6807a6846" (UID: "e6070617-4dba-40fb-a6a8-03f6807a6846"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.285815 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-config" (OuterVolumeSpecName: "config") pod "e6070617-4dba-40fb-a6a8-03f6807a6846" (UID: "e6070617-4dba-40fb-a6a8-03f6807a6846"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.286003 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e6070617-4dba-40fb-a6a8-03f6807a6846" (UID: "e6070617-4dba-40fb-a6a8-03f6807a6846"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.291517 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6070617-4dba-40fb-a6a8-03f6807a6846-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e6070617-4dba-40fb-a6a8-03f6807a6846" (UID: "e6070617-4dba-40fb-a6a8-03f6807a6846"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.295039 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6070617-4dba-40fb-a6a8-03f6807a6846-kube-api-access-g2gv4" (OuterVolumeSpecName: "kube-api-access-g2gv4") pod "e6070617-4dba-40fb-a6a8-03f6807a6846" (UID: "e6070617-4dba-40fb-a6a8-03f6807a6846"). InnerVolumeSpecName "kube-api-access-g2gv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.386491 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.386537 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6070617-4dba-40fb-a6a8-03f6807a6846-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.386546 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2gv4\" (UniqueName: \"kubernetes.io/projected/e6070617-4dba-40fb-a6a8-03f6807a6846-kube-api-access-g2gv4\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.386557 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.386569 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6070617-4dba-40fb-a6a8-03f6807a6846-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.534858 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk6t7"] Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.537937 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk6t7"] Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.654582 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.716985 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6070617-4dba-40fb-a6a8-03f6807a6846" path="/var/lib/kubelet/pods/e6070617-4dba-40fb-a6a8-03f6807a6846/volumes" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.791802 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-client-ca\") pod \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.791925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-config\") pod \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.791980 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fghnh\" (UniqueName: \"kubernetes.io/projected/4104d0d5-52a2-47da-9cd0-86d4c52bc122-kube-api-access-fghnh\") pod \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.792035 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4104d0d5-52a2-47da-9cd0-86d4c52bc122-serving-cert\") pod \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\" (UID: \"4104d0d5-52a2-47da-9cd0-86d4c52bc122\") " Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.793809 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-client-ca" (OuterVolumeSpecName: "client-ca") pod "4104d0d5-52a2-47da-9cd0-86d4c52bc122" (UID: "4104d0d5-52a2-47da-9cd0-86d4c52bc122"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.793912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-config" (OuterVolumeSpecName: "config") pod "4104d0d5-52a2-47da-9cd0-86d4c52bc122" (UID: "4104d0d5-52a2-47da-9cd0-86d4c52bc122"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.797038 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4104d0d5-52a2-47da-9cd0-86d4c52bc122-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4104d0d5-52a2-47da-9cd0-86d4c52bc122" (UID: "4104d0d5-52a2-47da-9cd0-86d4c52bc122"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.797651 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4104d0d5-52a2-47da-9cd0-86d4c52bc122-kube-api-access-fghnh" (OuterVolumeSpecName: "kube-api-access-fghnh") pod "4104d0d5-52a2-47da-9cd0-86d4c52bc122" (UID: "4104d0d5-52a2-47da-9cd0-86d4c52bc122"). InnerVolumeSpecName "kube-api-access-fghnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.893435 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fghnh\" (UniqueName: \"kubernetes.io/projected/4104d0d5-52a2-47da-9cd0-86d4c52bc122-kube-api-access-fghnh\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.893480 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4104d0d5-52a2-47da-9cd0-86d4c52bc122-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.893492 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:56 crc kubenswrapper[4722]: I1002 09:44:56.893503 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4104d0d5-52a2-47da-9cd0-86d4c52bc122-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.196624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" event={"ID":"4104d0d5-52a2-47da-9cd0-86d4c52bc122","Type":"ContainerDied","Data":"6a36b54e551113d0dcd7682bb6aa084a99cbe0acd2bb7e9dda37f7a5046b225f"} Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.196689 4722 scope.go:117] "RemoveContainer" containerID="f0d0a6890e32e6337bf39f4e43cfa0b6f045e101f7a3f7698da8337461d2c197" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.196647 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.224203 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"] Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.226712 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pf6dm"] Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.598398 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78fd575467-vnbwm"] Oct 02 09:44:57 crc kubenswrapper[4722]: E1002 09:44:57.598681 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4104d0d5-52a2-47da-9cd0-86d4c52bc122" containerName="route-controller-manager" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.598695 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4104d0d5-52a2-47da-9cd0-86d4c52bc122" containerName="route-controller-manager" Oct 02 09:44:57 crc kubenswrapper[4722]: E1002 09:44:57.598706 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6070617-4dba-40fb-a6a8-03f6807a6846" containerName="controller-manager" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.598713 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6070617-4dba-40fb-a6a8-03f6807a6846" containerName="controller-manager" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.599004 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6070617-4dba-40fb-a6a8-03f6807a6846" containerName="controller-manager" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.599018 4722 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4104d0d5-52a2-47da-9cd0-86d4c52bc122" containerName="route-controller-manager" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.599517 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.602084 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz"] Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.603101 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.603453 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.604187 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.604304 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.604304 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.604986 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.605045 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.608221 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.608621 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.609182 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.609701 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.609797 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.614935 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.619682 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz"] Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.622408 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.624607 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-78fd575467-vnbwm"] Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.701019 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h2km\" (UniqueName: \"kubernetes.io/projected/535a9337-4944-40a6-b46a-85f1459556e0-kube-api-access-2h2km\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.701179 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447f5397-ee30-45fb-ba84-349a2006f5f3-config\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.701209 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535a9337-4944-40a6-b46a-85f1459556e0-config\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.701227 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9bkk\" (UniqueName: \"kubernetes.io/projected/447f5397-ee30-45fb-ba84-349a2006f5f3-kube-api-access-r9bkk\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.701254 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/447f5397-ee30-45fb-ba84-349a2006f5f3-client-ca\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.701276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447f5397-ee30-45fb-ba84-349a2006f5f3-serving-cert\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.701294 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535a9337-4944-40a6-b46a-85f1459556e0-serving-cert\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.701333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/535a9337-4944-40a6-b46a-85f1459556e0-proxy-ca-bundles\") pod \"controller-manager-78fd575467-vnbwm\" (UID: 
\"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.701353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/535a9337-4944-40a6-b46a-85f1459556e0-client-ca\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.802787 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447f5397-ee30-45fb-ba84-349a2006f5f3-config\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.802893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535a9337-4944-40a6-b46a-85f1459556e0-config\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.802933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9bkk\" (UniqueName: \"kubernetes.io/projected/447f5397-ee30-45fb-ba84-349a2006f5f3-kube-api-access-r9bkk\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.802968 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/447f5397-ee30-45fb-ba84-349a2006f5f3-client-ca\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.802993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447f5397-ee30-45fb-ba84-349a2006f5f3-serving-cert\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.803007 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535a9337-4944-40a6-b46a-85f1459556e0-serving-cert\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.803070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/535a9337-4944-40a6-b46a-85f1459556e0-proxy-ca-bundles\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc 
kubenswrapper[4722]: I1002 09:44:57.803106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/535a9337-4944-40a6-b46a-85f1459556e0-client-ca\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.803153 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h2km\" (UniqueName: \"kubernetes.io/projected/535a9337-4944-40a6-b46a-85f1459556e0-kube-api-access-2h2km\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.804293 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/447f5397-ee30-45fb-ba84-349a2006f5f3-client-ca\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.805110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/535a9337-4944-40a6-b46a-85f1459556e0-client-ca\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.805513 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535a9337-4944-40a6-b46a-85f1459556e0-config\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.806474 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/535a9337-4944-40a6-b46a-85f1459556e0-proxy-ca-bundles\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.808834 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447f5397-ee30-45fb-ba84-349a2006f5f3-serving-cert\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.809944 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535a9337-4944-40a6-b46a-85f1459556e0-serving-cert\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.818229 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447f5397-ee30-45fb-ba84-349a2006f5f3-config\") pod 
\"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.822930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9bkk\" (UniqueName: \"kubernetes.io/projected/447f5397-ee30-45fb-ba84-349a2006f5f3-kube-api-access-r9bkk\") pod \"route-controller-manager-5566f95bdb-n8bvz\" (UID: \"447f5397-ee30-45fb-ba84-349a2006f5f3\") " pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.825967 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h2km\" (UniqueName: \"kubernetes.io/projected/535a9337-4944-40a6-b46a-85f1459556e0-kube-api-access-2h2km\") pod \"controller-manager-78fd575467-vnbwm\" (UID: \"535a9337-4944-40a6-b46a-85f1459556e0\") " pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.921582 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:57 crc kubenswrapper[4722]: I1002 09:44:57.935768 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:58 crc kubenswrapper[4722]: I1002 09:44:58.174490 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78fd575467-vnbwm"] Oct 02 09:44:58 crc kubenswrapper[4722]: I1002 09:44:58.214984 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" event={"ID":"535a9337-4944-40a6-b46a-85f1459556e0","Type":"ContainerStarted","Data":"8ae0067aafc71acdb39c54f5d9aa7b562baee42d3fda244ef5cd7ab0eb33be5c"} Oct 02 09:44:58 crc kubenswrapper[4722]: I1002 09:44:58.227919 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz"] Oct 02 09:44:58 crc kubenswrapper[4722]: I1002 09:44:58.717804 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4104d0d5-52a2-47da-9cd0-86d4c52bc122" path="/var/lib/kubelet/pods/4104d0d5-52a2-47da-9cd0-86d4c52bc122/volumes" Oct 02 09:44:59 crc kubenswrapper[4722]: I1002 09:44:59.223038 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" event={"ID":"447f5397-ee30-45fb-ba84-349a2006f5f3","Type":"ContainerStarted","Data":"8dede2e02ee592273b0e3ecc06f5d986b733eb9c127f8e5a6f8b7390abf7878f"} Oct 02 09:44:59 crc kubenswrapper[4722]: I1002 09:44:59.223086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" event={"ID":"447f5397-ee30-45fb-ba84-349a2006f5f3","Type":"ContainerStarted","Data":"81c98bcd4645ef40278c84627557f8d76b13452324481e7c3db313912acf95df"} Oct 02 09:44:59 crc kubenswrapper[4722]: I1002 09:44:59.223110 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:59 crc kubenswrapper[4722]: I1002 09:44:59.226389 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" event={"ID":"535a9337-4944-40a6-b46a-85f1459556e0","Type":"ContainerStarted","Data":"2e267ed64e774c7ee8ca6c8c76e0929fbc209bb0503a23c1621ff6ef13ae2ec4"} Oct 02 09:44:59 crc kubenswrapper[4722]: I1002 09:44:59.226717 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:59 crc kubenswrapper[4722]: I1002 09:44:59.228069 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" Oct 02 09:44:59 crc kubenswrapper[4722]: I1002 09:44:59.244843 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" Oct 02 09:44:59 crc kubenswrapper[4722]: I1002 09:44:59.245397 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5566f95bdb-n8bvz" podStartSLOduration=4.245308898 podStartE2EDuration="4.245308898s" podCreationTimestamp="2025-10-02 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:44:59.244011416 +0000 UTC m=+755.280764406" watchObservedRunningTime="2025-10-02 09:44:59.245308898 +0000 UTC m=+755.282061878" Oct 02 09:44:59 crc kubenswrapper[4722]: I1002 09:44:59.293198 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78fd575467-vnbwm" podStartSLOduration=4.293176351 podStartE2EDuration="4.293176351s" podCreationTimestamp="2025-10-02 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:44:59.286741341 +0000 UTC m=+755.323494331" watchObservedRunningTime="2025-10-02 09:44:59.293176351 +0000 UTC m=+755.329929331" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.140165 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk"] Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.141299 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.144297 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.144591 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.148088 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk"] Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.234249 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fa93458-4a1d-4542-b50c-118aa3d1b164-config-volume\") pod \"collect-profiles-29323305-tkcgk\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.234300 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrfs\" (UniqueName: \"kubernetes.io/projected/2fa93458-4a1d-4542-b50c-118aa3d1b164-kube-api-access-qxrfs\") pod \"collect-profiles-29323305-tkcgk\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.234355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fa93458-4a1d-4542-b50c-118aa3d1b164-secret-volume\") pod \"collect-profiles-29323305-tkcgk\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.335101 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fa93458-4a1d-4542-b50c-118aa3d1b164-config-volume\") pod \"collect-profiles-29323305-tkcgk\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.335171 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrfs\" (UniqueName: \"kubernetes.io/projected/2fa93458-4a1d-4542-b50c-118aa3d1b164-kube-api-access-qxrfs\") pod \"collect-profiles-29323305-tkcgk\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.335203 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fa93458-4a1d-4542-b50c-118aa3d1b164-secret-volume\") pod \"collect-profiles-29323305-tkcgk\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.336444 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fa93458-4a1d-4542-b50c-118aa3d1b164-config-volume\") pod 
\"collect-profiles-29323305-tkcgk\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.344783 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fa93458-4a1d-4542-b50c-118aa3d1b164-secret-volume\") pod \"collect-profiles-29323305-tkcgk\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.356190 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxrfs\" (UniqueName: \"kubernetes.io/projected/2fa93458-4a1d-4542-b50c-118aa3d1b164-kube-api-access-qxrfs\") pod \"collect-profiles-29323305-tkcgk\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.459553 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:00 crc kubenswrapper[4722]: I1002 09:45:00.855526 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk"] Oct 02 09:45:00 crc kubenswrapper[4722]: W1002 09:45:00.862473 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa93458_4a1d_4542_b50c_118aa3d1b164.slice/crio-d7e4422dba4f35c6b57e2cf91aceb321cc2d4c6e1297a941bc55ee079c6fad96 WatchSource:0}: Error finding container d7e4422dba4f35c6b57e2cf91aceb321cc2d4c6e1297a941bc55ee079c6fad96: Status 404 returned error can't find the container with id d7e4422dba4f35c6b57e2cf91aceb321cc2d4c6e1297a941bc55ee079c6fad96 Oct 02 09:45:01 crc kubenswrapper[4722]: I1002 09:45:01.239706 4722 generic.go:334] "Generic (PLEG): container finished" podID="2fa93458-4a1d-4542-b50c-118aa3d1b164" containerID="14769dda0273cee74462ee3dd0e8c4bcbb39f5ac8314671c8f1e3386ab7322f2" exitCode=0 Oct 02 09:45:01 crc kubenswrapper[4722]: I1002 09:45:01.239795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" event={"ID":"2fa93458-4a1d-4542-b50c-118aa3d1b164","Type":"ContainerDied","Data":"14769dda0273cee74462ee3dd0e8c4bcbb39f5ac8314671c8f1e3386ab7322f2"} Oct 02 09:45:01 crc kubenswrapper[4722]: I1002 09:45:01.240105 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" event={"ID":"2fa93458-4a1d-4542-b50c-118aa3d1b164","Type":"ContainerStarted","Data":"d7e4422dba4f35c6b57e2cf91aceb321cc2d4c6e1297a941bc55ee079c6fad96"} Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.537978 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.668194 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxrfs\" (UniqueName: \"kubernetes.io/projected/2fa93458-4a1d-4542-b50c-118aa3d1b164-kube-api-access-qxrfs\") pod \"2fa93458-4a1d-4542-b50c-118aa3d1b164\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.668270 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fa93458-4a1d-4542-b50c-118aa3d1b164-config-volume\") pod \"2fa93458-4a1d-4542-b50c-118aa3d1b164\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.668348 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fa93458-4a1d-4542-b50c-118aa3d1b164-secret-volume\") pod \"2fa93458-4a1d-4542-b50c-118aa3d1b164\" (UID: \"2fa93458-4a1d-4542-b50c-118aa3d1b164\") " Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.669285 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fa93458-4a1d-4542-b50c-118aa3d1b164-config-volume" (OuterVolumeSpecName: "config-volume") pod "2fa93458-4a1d-4542-b50c-118aa3d1b164" (UID: "2fa93458-4a1d-4542-b50c-118aa3d1b164"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.680563 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fa93458-4a1d-4542-b50c-118aa3d1b164-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2fa93458-4a1d-4542-b50c-118aa3d1b164" (UID: "2fa93458-4a1d-4542-b50c-118aa3d1b164"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.680593 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa93458-4a1d-4542-b50c-118aa3d1b164-kube-api-access-qxrfs" (OuterVolumeSpecName: "kube-api-access-qxrfs") pod "2fa93458-4a1d-4542-b50c-118aa3d1b164" (UID: "2fa93458-4a1d-4542-b50c-118aa3d1b164"). InnerVolumeSpecName "kube-api-access-qxrfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.769376 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxrfs\" (UniqueName: \"kubernetes.io/projected/2fa93458-4a1d-4542-b50c-118aa3d1b164-kube-api-access-qxrfs\") on node \"crc\" DevicePath \"\"" Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.769911 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fa93458-4a1d-4542-b50c-118aa3d1b164-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.769924 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fa93458-4a1d-4542-b50c-118aa3d1b164-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 09:45:02 crc kubenswrapper[4722]: E1002 09:45:02.818745 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa93458_4a1d_4542_b50c_118aa3d1b164.slice\": RecentStats: unable to find data in memory cache]" Oct 02 09:45:02 crc kubenswrapper[4722]: I1002 09:45:02.947820 4722 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 09:45:03 crc kubenswrapper[4722]: I1002 09:45:03.252793 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" event={"ID":"2fa93458-4a1d-4542-b50c-118aa3d1b164","Type":"ContainerDied","Data":"d7e4422dba4f35c6b57e2cf91aceb321cc2d4c6e1297a941bc55ee079c6fad96"} Oct 02 09:45:03 crc kubenswrapper[4722]: I1002 09:45:03.252839 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7e4422dba4f35c6b57e2cf91aceb321cc2d4c6e1297a941bc55ee079c6fad96" Oct 02 09:45:03 crc kubenswrapper[4722]: I1002 09:45:03.252841 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323305-tkcgk" Oct 02 09:45:07 crc kubenswrapper[4722]: I1002 09:45:07.899036 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v"] Oct 02 09:45:07 crc kubenswrapper[4722]: E1002 09:45:07.900728 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa93458-4a1d-4542-b50c-118aa3d1b164" containerName="collect-profiles" Oct 02 09:45:07 crc kubenswrapper[4722]: I1002 09:45:07.900754 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa93458-4a1d-4542-b50c-118aa3d1b164" containerName="collect-profiles" Oct 02 09:45:07 crc kubenswrapper[4722]: I1002 09:45:07.900878 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa93458-4a1d-4542-b50c-118aa3d1b164" containerName="collect-profiles" Oct 02 09:45:07 crc kubenswrapper[4722]: I1002 09:45:07.901969 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:07 crc kubenswrapper[4722]: I1002 09:45:07.904048 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 09:45:07 crc kubenswrapper[4722]: I1002 09:45:07.911908 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v"] Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.044390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6986\" (UniqueName: \"kubernetes.io/projected/98e4ffb9-0cc0-4196-8f18-08f3a5827133-kube-api-access-p6986\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.044541 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.044621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.146120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.146207 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.146283 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6986\" (UniqueName: \"kubernetes.io/projected/98e4ffb9-0cc0-4196-8f18-08f3a5827133-kube-api-access-p6986\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.146735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.146820 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.169920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6986\" (UniqueName: \"kubernetes.io/projected/98e4ffb9-0cc0-4196-8f18-08f3a5827133-kube-api-access-p6986\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.223215 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:08 crc kubenswrapper[4722]: I1002 09:45:08.620728 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v"] Oct 02 09:45:08 crc kubenswrapper[4722]: W1002 09:45:08.634530 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98e4ffb9_0cc0_4196_8f18_08f3a5827133.slice/crio-e429144cd60bf3173dcd03805ddfc1da73d222f5ea0fc71363fa8fef0a54fcfe WatchSource:0}: Error finding container e429144cd60bf3173dcd03805ddfc1da73d222f5ea0fc71363fa8fef0a54fcfe: Status 404 returned error can't find the container with id e429144cd60bf3173dcd03805ddfc1da73d222f5ea0fc71363fa8fef0a54fcfe Oct 02 09:45:09 crc kubenswrapper[4722]: I1002 09:45:09.282923 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" event={"ID":"98e4ffb9-0cc0-4196-8f18-08f3a5827133","Type":"ContainerStarted","Data":"64ee235d3b577ba7f4d87a027231388d5058e71ca95de49fccc9e3a7805d9a91"} Oct 02 09:45:09 crc kubenswrapper[4722]: I1002 09:45:09.283412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" event={"ID":"98e4ffb9-0cc0-4196-8f18-08f3a5827133","Type":"ContainerStarted","Data":"e429144cd60bf3173dcd03805ddfc1da73d222f5ea0fc71363fa8fef0a54fcfe"} Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.269109 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rzghp"] Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.270615 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.284815 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzghp"] Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.295308 4722 generic.go:334] "Generic (PLEG): container finished" podID="98e4ffb9-0cc0-4196-8f18-08f3a5827133" containerID="64ee235d3b577ba7f4d87a027231388d5058e71ca95de49fccc9e3a7805d9a91" exitCode=0 Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.295397 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" event={"ID":"98e4ffb9-0cc0-4196-8f18-08f3a5827133","Type":"ContainerDied","Data":"64ee235d3b577ba7f4d87a027231388d5058e71ca95de49fccc9e3a7805d9a91"} Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.297463 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.376731 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-catalog-content\") pod \"redhat-operators-rzghp\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") " pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.376794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpxpk\" (UniqueName: \"kubernetes.io/projected/20e5d97a-c526-45ce-a71b-8ffdcbbed003-kube-api-access-vpxpk\") pod \"redhat-operators-rzghp\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") " pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.376928 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-utilities\") pod \"redhat-operators-rzghp\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") " pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.478573 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-utilities\") pod \"redhat-operators-rzghp\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") " pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.478652 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-catalog-content\") pod \"redhat-operators-rzghp\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") " pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.478671 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpxpk\" (UniqueName: \"kubernetes.io/projected/20e5d97a-c526-45ce-a71b-8ffdcbbed003-kube-api-access-vpxpk\") pod \"redhat-operators-rzghp\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") " pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.479419 4722 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-catalog-content\") pod \"redhat-operators-rzghp\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") " pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.479420 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-utilities\") pod \"redhat-operators-rzghp\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") " pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.500204 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpxpk\" (UniqueName: \"kubernetes.io/projected/20e5d97a-c526-45ce-a71b-8ffdcbbed003-kube-api-access-vpxpk\") pod \"redhat-operators-rzghp\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") " pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.588192 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.696770 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:45:10 crc kubenswrapper[4722]: I1002 09:45:10.697230 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:45:11 crc kubenswrapper[4722]: I1002 09:45:11.012162 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzghp"] Oct 02 09:45:11 crc kubenswrapper[4722]: W1002 09:45:11.020686 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e5d97a_c526_45ce_a71b_8ffdcbbed003.slice/crio-5bfe30ca6e17c1aa596fb8bb59632eecef618a4942b6f97989c807749da47918 WatchSource:0}: Error finding container 5bfe30ca6e17c1aa596fb8bb59632eecef618a4942b6f97989c807749da47918: Status 404 returned error can't find the container with id 5bfe30ca6e17c1aa596fb8bb59632eecef618a4942b6f97989c807749da47918 Oct 02 09:45:11 crc kubenswrapper[4722]: I1002 09:45:11.302801 4722 generic.go:334] "Generic (PLEG): container finished" podID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerID="5ffea75b8e58aa7ea94bea94762735bc56417cc39ade1bf2d8ab0d6b9d999058" exitCode=0 Oct 02 09:45:11 crc kubenswrapper[4722]: I1002 09:45:11.302909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzghp" event={"ID":"20e5d97a-c526-45ce-a71b-8ffdcbbed003","Type":"ContainerDied","Data":"5ffea75b8e58aa7ea94bea94762735bc56417cc39ade1bf2d8ab0d6b9d999058"} Oct 02 09:45:11 crc kubenswrapper[4722]: I1002 09:45:11.303199 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzghp" 
event={"ID":"20e5d97a-c526-45ce-a71b-8ffdcbbed003","Type":"ContainerStarted","Data":"5bfe30ca6e17c1aa596fb8bb59632eecef618a4942b6f97989c807749da47918"} Oct 02 09:45:12 crc kubenswrapper[4722]: I1002 09:45:12.316406 4722 generic.go:334] "Generic (PLEG): container finished" podID="98e4ffb9-0cc0-4196-8f18-08f3a5827133" containerID="c945f68f78ca6d0620be62b03736398883641dc9a58d39134e87afad4f44899a" exitCode=0 Oct 02 09:45:12 crc kubenswrapper[4722]: I1002 09:45:12.316485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" event={"ID":"98e4ffb9-0cc0-4196-8f18-08f3a5827133","Type":"ContainerDied","Data":"c945f68f78ca6d0620be62b03736398883641dc9a58d39134e87afad4f44899a"} Oct 02 09:45:13 crc kubenswrapper[4722]: I1002 09:45:13.324186 4722 generic.go:334] "Generic (PLEG): container finished" podID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerID="748e063f0bfa7e11d9a275b813b3c75e8525d9a0dc103f1deab0091c44b052f1" exitCode=0 Oct 02 09:45:13 crc kubenswrapper[4722]: I1002 09:45:13.324234 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzghp" event={"ID":"20e5d97a-c526-45ce-a71b-8ffdcbbed003","Type":"ContainerDied","Data":"748e063f0bfa7e11d9a275b813b3c75e8525d9a0dc103f1deab0091c44b052f1"} Oct 02 09:45:13 crc kubenswrapper[4722]: I1002 09:45:13.334059 4722 generic.go:334] "Generic (PLEG): container finished" podID="98e4ffb9-0cc0-4196-8f18-08f3a5827133" containerID="68d9887fcd85061ea85f2d859c1734c4a652d7768c47b29892d51f1ad8355ec0" exitCode=0 Oct 02 09:45:13 crc kubenswrapper[4722]: I1002 09:45:13.334106 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" event={"ID":"98e4ffb9-0cc0-4196-8f18-08f3a5827133","Type":"ContainerDied","Data":"68d9887fcd85061ea85f2d859c1734c4a652d7768c47b29892d51f1ad8355ec0"} Oct 02 09:45:13 crc kubenswrapper[4722]: I1002 09:45:13.376592 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xnfpf" Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.346501 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzghp" event={"ID":"20e5d97a-c526-45ce-a71b-8ffdcbbed003","Type":"ContainerStarted","Data":"b04c78cd6aba76d8314bdfd70bbaaa8142160db9d59949e46421c93a64fb221e"} Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.376114 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rzghp" podStartSLOduration=1.927334315 podStartE2EDuration="4.376073979s" podCreationTimestamp="2025-10-02 09:45:10 +0000 UTC" firstStartedPulling="2025-10-02 09:45:11.304901224 +0000 UTC m=+767.341654204" lastFinishedPulling="2025-10-02 09:45:13.753640888 +0000 UTC m=+769.790393868" observedRunningTime="2025-10-02 09:45:14.374577172 +0000 UTC m=+770.411330162" watchObservedRunningTime="2025-10-02 09:45:14.376073979 +0000 UTC m=+770.412826969" Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.667221 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.744908 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-bundle\") pod \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.744947 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-util\") pod \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.745033 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6986\" (UniqueName: \"kubernetes.io/projected/98e4ffb9-0cc0-4196-8f18-08f3a5827133-kube-api-access-p6986\") pod \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\" (UID: \"98e4ffb9-0cc0-4196-8f18-08f3a5827133\") " Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.745867 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-bundle" (OuterVolumeSpecName: "bundle") pod "98e4ffb9-0cc0-4196-8f18-08f3a5827133" (UID: "98e4ffb9-0cc0-4196-8f18-08f3a5827133"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.753457 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e4ffb9-0cc0-4196-8f18-08f3a5827133-kube-api-access-p6986" (OuterVolumeSpecName: "kube-api-access-p6986") pod "98e4ffb9-0cc0-4196-8f18-08f3a5827133" (UID: "98e4ffb9-0cc0-4196-8f18-08f3a5827133"). InnerVolumeSpecName "kube-api-access-p6986". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.757678 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-util" (OuterVolumeSpecName: "util") pod "98e4ffb9-0cc0-4196-8f18-08f3a5827133" (UID: "98e4ffb9-0cc0-4196-8f18-08f3a5827133"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.846309 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.846360 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98e4ffb9-0cc0-4196-8f18-08f3a5827133-util\") on node \"crc\" DevicePath \"\"" Oct 02 09:45:14 crc kubenswrapper[4722]: I1002 09:45:14.846373 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6986\" (UniqueName: \"kubernetes.io/projected/98e4ffb9-0cc0-4196-8f18-08f3a5827133-kube-api-access-p6986\") on node \"crc\" DevicePath \"\"" Oct 02 09:45:15 crc kubenswrapper[4722]: I1002 09:45:15.357116 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" Oct 02 09:45:15 crc kubenswrapper[4722]: I1002 09:45:15.357104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v" event={"ID":"98e4ffb9-0cc0-4196-8f18-08f3a5827133","Type":"ContainerDied","Data":"e429144cd60bf3173dcd03805ddfc1da73d222f5ea0fc71363fa8fef0a54fcfe"} Oct 02 09:45:15 crc kubenswrapper[4722]: I1002 09:45:15.357200 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e429144cd60bf3173dcd03805ddfc1da73d222f5ea0fc71363fa8fef0a54fcfe" Oct 02 09:45:20 crc kubenswrapper[4722]: I1002 09:45:20.588944 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:20 crc kubenswrapper[4722]: I1002 09:45:20.591363 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:20 crc kubenswrapper[4722]: I1002 09:45:20.654587 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:21 crc kubenswrapper[4722]: I1002 09:45:21.441657 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rzghp" Oct 02 09:45:22 crc kubenswrapper[4722]: I1002 09:45:22.651360 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzghp"] Oct 02 09:45:24 crc kubenswrapper[4722]: I1002 09:45:24.408595 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rzghp" podUID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerName="registry-server" containerID="cri-o://b04c78cd6aba76d8314bdfd70bbaaa8142160db9d59949e46421c93a64fb221e" gracePeriod=2 Oct 02 09:45:24 crc kubenswrapper[4722]: I1002 09:45:24.903437 4722 scope.go:117] "RemoveContainer" containerID="aae2e270e209af7e6afbf96e995421ccb4fd0aa09be5011bb0458307a8d5788a" Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.434639 4722 generic.go:334] "Generic (PLEG): container finished" podID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerID="b04c78cd6aba76d8314bdfd70bbaaa8142160db9d59949e46421c93a64fb221e" exitCode=0 Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.435072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzghp" event={"ID":"20e5d97a-c526-45ce-a71b-8ffdcbbed003","Type":"ContainerDied","Data":"b04c78cd6aba76d8314bdfd70bbaaa8142160db9d59949e46421c93a64fb221e"} Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.441736 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlns_04c84986-75a1-4683-8764-8f7307d1126a/kube-multus/2.log" Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.615509 4722 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.689512 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-utilities\") pod \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") "
Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.689582 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-catalog-content\") pod \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") "
Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.689717 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpxpk\" (UniqueName: \"kubernetes.io/projected/20e5d97a-c526-45ce-a71b-8ffdcbbed003-kube-api-access-vpxpk\") pod \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\" (UID: \"20e5d97a-c526-45ce-a71b-8ffdcbbed003\") "
Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.690374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-utilities" (OuterVolumeSpecName: "utilities") pod "20e5d97a-c526-45ce-a71b-8ffdcbbed003" (UID: "20e5d97a-c526-45ce-a71b-8ffdcbbed003"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.697134 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e5d97a-c526-45ce-a71b-8ffdcbbed003-kube-api-access-vpxpk" (OuterVolumeSpecName: "kube-api-access-vpxpk") pod "20e5d97a-c526-45ce-a71b-8ffdcbbed003" (UID: "20e5d97a-c526-45ce-a71b-8ffdcbbed003"). InnerVolumeSpecName "kube-api-access-vpxpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.765187 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20e5d97a-c526-45ce-a71b-8ffdcbbed003" (UID: "20e5d97a-c526-45ce-a71b-8ffdcbbed003"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.791603 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpxpk\" (UniqueName: \"kubernetes.io/projected/20e5d97a-c526-45ce-a71b-8ffdcbbed003-kube-api-access-vpxpk\") on node \"crc\" DevicePath \"\""
Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.791633 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 09:45:25 crc kubenswrapper[4722]: I1002 09:45:25.791646 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e5d97a-c526-45ce-a71b-8ffdcbbed003-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.180225 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg"]
Oct 02 09:45:26 crc kubenswrapper[4722]: E1002 09:45:26.180480 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e4ffb9-0cc0-4196-8f18-08f3a5827133" containerName="util"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.180494 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e4ffb9-0cc0-4196-8f18-08f3a5827133" containerName="util"
Oct 02 09:45:26 crc kubenswrapper[4722]: E1002 09:45:26.180507 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerName="registry-server"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.180514 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerName="registry-server"
Oct 02 09:45:26 crc kubenswrapper[4722]: E1002 09:45:26.180523 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e4ffb9-0cc0-4196-8f18-08f3a5827133" containerName="pull"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.180531 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e4ffb9-0cc0-4196-8f18-08f3a5827133" containerName="pull"
Oct 02 09:45:26 crc kubenswrapper[4722]: E1002 09:45:26.180543 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerName="extract-utilities"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.180551 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerName="extract-utilities"
Oct 02 09:45:26 crc kubenswrapper[4722]: E1002 09:45:26.180565 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerName="extract-content"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.180572 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerName="extract-content"
Oct 02 09:45:26 crc kubenswrapper[4722]: E1002 09:45:26.180582 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e4ffb9-0cc0-4196-8f18-08f3a5827133" containerName="extract"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.180588 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e4ffb9-0cc0-4196-8f18-08f3a5827133" containerName="extract"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.180691 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e4ffb9-0cc0-4196-8f18-08f3a5827133" containerName="extract"
containerName="extract" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.180707 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" containerName="registry-server" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.181103 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.182917 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2jdjm" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.183007 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.183816 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.183936 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.196819 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.198013 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg"] Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.299341 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whxcg\" (UniqueName: \"kubernetes.io/projected/68b65273-5373-4afe-86e2-e252c7d5b8a2-kube-api-access-whxcg\") pod \"metallb-operator-controller-manager-5559d9646f-85qkg\" (UID: \"68b65273-5373-4afe-86e2-e252c7d5b8a2\") " pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.299428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/68b65273-5373-4afe-86e2-e252c7d5b8a2-webhook-cert\") pod \"metallb-operator-controller-manager-5559d9646f-85qkg\" (UID: \"68b65273-5373-4afe-86e2-e252c7d5b8a2\") " pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.299501 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/68b65273-5373-4afe-86e2-e252c7d5b8a2-apiservice-cert\") pod \"metallb-operator-controller-manager-5559d9646f-85qkg\" (UID: \"68b65273-5373-4afe-86e2-e252c7d5b8a2\") " pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.400825 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whxcg\" (UniqueName: \"kubernetes.io/projected/68b65273-5373-4afe-86e2-e252c7d5b8a2-kube-api-access-whxcg\") pod \"metallb-operator-controller-manager-5559d9646f-85qkg\" (UID: \"68b65273-5373-4afe-86e2-e252c7d5b8a2\") " pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.401648 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/68b65273-5373-4afe-86e2-e252c7d5b8a2-webhook-cert\") pod \"metallb-operator-controller-manager-5559d9646f-85qkg\" (UID: \"68b65273-5373-4afe-86e2-e252c7d5b8a2\") " pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.401747 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/68b65273-5373-4afe-86e2-e252c7d5b8a2-apiservice-cert\") pod \"metallb-operator-controller-manager-5559d9646f-85qkg\" (UID: \"68b65273-5373-4afe-86e2-e252c7d5b8a2\") " pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.406140 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/68b65273-5373-4afe-86e2-e252c7d5b8a2-apiservice-cert\") pod \"metallb-operator-controller-manager-5559d9646f-85qkg\" (UID: \"68b65273-5373-4afe-86e2-e252c7d5b8a2\") " pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.408905 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/68b65273-5373-4afe-86e2-e252c7d5b8a2-webhook-cert\") pod \"metallb-operator-controller-manager-5559d9646f-85qkg\" (UID: \"68b65273-5373-4afe-86e2-e252c7d5b8a2\") " pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.442229 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whxcg\" (UniqueName: \"kubernetes.io/projected/68b65273-5373-4afe-86e2-e252c7d5b8a2-kube-api-access-whxcg\") pod \"metallb-operator-controller-manager-5559d9646f-85qkg\" (UID: \"68b65273-5373-4afe-86e2-e252c7d5b8a2\") " pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.444866 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"] Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.448379 4722 util.go:30] "No sandbox for pod can be found. 
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.453975 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.454222 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7hvjt"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.454494 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.457564 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzghp" event={"ID":"20e5d97a-c526-45ce-a71b-8ffdcbbed003","Type":"ContainerDied","Data":"5bfe30ca6e17c1aa596fb8bb59632eecef618a4942b6f97989c807749da47918"}
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.457645 4722 scope.go:117] "RemoveContainer" containerID="b04c78cd6aba76d8314bdfd70bbaaa8142160db9d59949e46421c93a64fb221e"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.457821 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzghp"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.469394 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"]
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.480129 4722 scope.go:117] "RemoveContainer" containerID="748e063f0bfa7e11d9a275b813b3c75e8525d9a0dc103f1deab0091c44b052f1"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.497713 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.509454 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzghp"]
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.511681 4722 scope.go:117] "RemoveContainer" containerID="5ffea75b8e58aa7ea94bea94762735bc56417cc39ade1bf2d8ab0d6b9d999058"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.513173 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rzghp"]
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.607264 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc0ccdf8-90ba-43dc-9c71-e002f1fe038c-apiservice-cert\") pod \"metallb-operator-webhook-server-546df5c677-k4c6f\" (UID: \"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c\") " pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.607307 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7txz\" (UniqueName: \"kubernetes.io/projected/cc0ccdf8-90ba-43dc-9c71-e002f1fe038c-kube-api-access-r7txz\") pod \"metallb-operator-webhook-server-546df5c677-k4c6f\" (UID: \"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c\") " pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.607350 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc0ccdf8-90ba-43dc-9c71-e002f1fe038c-webhook-cert\") pod \"metallb-operator-webhook-server-546df5c677-k4c6f\" (UID: \"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c\") " pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.708242 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc0ccdf8-90ba-43dc-9c71-e002f1fe038c-apiservice-cert\") pod \"metallb-operator-webhook-server-546df5c677-k4c6f\" (UID: \"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c\") " pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.708333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7txz\" (UniqueName: \"kubernetes.io/projected/cc0ccdf8-90ba-43dc-9c71-e002f1fe038c-kube-api-access-r7txz\") pod \"metallb-operator-webhook-server-546df5c677-k4c6f\" (UID: \"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c\") " pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.708374 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc0ccdf8-90ba-43dc-9c71-e002f1fe038c-webhook-cert\") pod \"metallb-operator-webhook-server-546df5c677-k4c6f\" (UID: \"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c\") " pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"
Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.713936 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc0ccdf8-90ba-43dc-9c71-e002f1fe038c-webhook-cert\") pod \"metallb-operator-webhook-server-546df5c677-k4c6f\" (UID: \"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c\") " pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"
\"metallb-operator-webhook-server-546df5c677-k4c6f\" (UID: \"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c\") " pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.716892 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc0ccdf8-90ba-43dc-9c71-e002f1fe038c-apiservice-cert\") pod \"metallb-operator-webhook-server-546df5c677-k4c6f\" (UID: \"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c\") " pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.729661 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e5d97a-c526-45ce-a71b-8ffdcbbed003" path="/var/lib/kubelet/pods/20e5d97a-c526-45ce-a71b-8ffdcbbed003/volumes" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.764982 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7txz\" (UniqueName: \"kubernetes.io/projected/cc0ccdf8-90ba-43dc-9c71-e002f1fe038c-kube-api-access-r7txz\") pod \"metallb-operator-webhook-server-546df5c677-k4c6f\" (UID: \"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c\") " pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.791465 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f" Oct 02 09:45:26 crc kubenswrapper[4722]: I1002 09:45:26.836819 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg"] Oct 02 09:45:27 crc kubenswrapper[4722]: I1002 09:45:27.268451 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"] Oct 02 09:45:27 crc kubenswrapper[4722]: W1002 09:45:27.278398 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc0ccdf8_90ba_43dc_9c71_e002f1fe038c.slice/crio-aa7bfa0f2c9c1b679f01ed1d55f7224ede54eab23184f8530fa8ace9c2c497e0 WatchSource:0}: Error finding container aa7bfa0f2c9c1b679f01ed1d55f7224ede54eab23184f8530fa8ace9c2c497e0: Status 404 returned error can't find the container with id aa7bfa0f2c9c1b679f01ed1d55f7224ede54eab23184f8530fa8ace9c2c497e0 Oct 02 09:45:27 crc kubenswrapper[4722]: I1002 09:45:27.465422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" event={"ID":"68b65273-5373-4afe-86e2-e252c7d5b8a2","Type":"ContainerStarted","Data":"bcc1a0b520e698e2cb0d337203b985eaf127fd46593a9d8db58e9b0bda3a6986"} Oct 02 09:45:27 crc kubenswrapper[4722]: I1002 09:45:27.467866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f" event={"ID":"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c","Type":"ContainerStarted","Data":"aa7bfa0f2c9c1b679f01ed1d55f7224ede54eab23184f8530fa8ace9c2c497e0"} Oct 02 09:45:31 crc kubenswrapper[4722]: I1002 09:45:31.493826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" event={"ID":"68b65273-5373-4afe-86e2-e252c7d5b8a2","Type":"ContainerStarted","Data":"35136f99ec72c85d6a5fdfa39de36c8c3dc0a43f82cf9ac9ebb8cb713a2d1a26"} Oct 02 09:45:31 crc kubenswrapper[4722]: I1002 09:45:31.494713 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" Oct 02 09:45:31 crc kubenswrapper[4722]: I1002 09:45:31.520121 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg" podStartSLOduration=1.781161264 podStartE2EDuration="5.520103268s" podCreationTimestamp="2025-10-02 09:45:26 +0000 UTC" firstStartedPulling="2025-10-02 09:45:26.859581868 +0000 UTC m=+782.896334848" lastFinishedPulling="2025-10-02 09:45:30.598523862 +0000 UTC m=+786.635276852" observedRunningTime="2025-10-02 09:45:31.517821242 +0000 UTC m=+787.554574242" watchObservedRunningTime="2025-10-02 09:45:31.520103268 +0000 UTC m=+787.556856258" Oct 02 09:45:33 crc kubenswrapper[4722]: I1002 09:45:33.511579 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f" event={"ID":"cc0ccdf8-90ba-43dc-9c71-e002f1fe038c","Type":"ContainerStarted","Data":"30de63a8217a3f7e16f40b76a730d457801dde4b1ead578a60ac2ff62df390f7"} Oct 02 09:45:33 crc kubenswrapper[4722]: I1002 09:45:33.512124 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f" Oct 02 09:45:33 crc kubenswrapper[4722]: I1002 09:45:33.535899 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f" podStartSLOduration=2.297762207 podStartE2EDuration="7.535872751s" podCreationTimestamp="2025-10-02 09:45:26 +0000 UTC" firstStartedPulling="2025-10-02 09:45:27.281333968 +0000 UTC m=+783.318086948" lastFinishedPulling="2025-10-02 09:45:32.519444512 +0000 UTC m=+788.556197492" observedRunningTime="2025-10-02 09:45:33.531475441 +0000 UTC m=+789.568228441" watchObservedRunningTime="2025-10-02 09:45:33.535872751 +0000 UTC m=+789.572625731" Oct 02 09:45:40 crc kubenswrapper[4722]: I1002 09:45:40.696999 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:45:40 crc kubenswrapper[4722]: I1002 09:45:40.697597 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:45:42 crc kubenswrapper[4722]: I1002 09:45:42.897492 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b9ndg"] Oct 02 09:45:42 crc kubenswrapper[4722]: I1002 09:45:42.898817 4722 util.go:30] "No sandbox for pod can be found. 
Oct 02 09:45:42 crc kubenswrapper[4722]: I1002 09:45:42.913077 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b9ndg"]
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.075293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-utilities\") pod \"certified-operators-b9ndg\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " pod="openshift-marketplace/certified-operators-b9ndg"
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.075515 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26z57\" (UniqueName: \"kubernetes.io/projected/3863b38a-9a0c-4fa3-909d-3b082b55197f-kube-api-access-26z57\") pod \"certified-operators-b9ndg\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " pod="openshift-marketplace/certified-operators-b9ndg"
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.075632 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-catalog-content\") pod \"certified-operators-b9ndg\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " pod="openshift-marketplace/certified-operators-b9ndg"
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.177239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-utilities\") pod \"certified-operators-b9ndg\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " pod="openshift-marketplace/certified-operators-b9ndg"
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.177341 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26z57\" (UniqueName: \"kubernetes.io/projected/3863b38a-9a0c-4fa3-909d-3b082b55197f-kube-api-access-26z57\") pod \"certified-operators-b9ndg\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " pod="openshift-marketplace/certified-operators-b9ndg"
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.177395 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-catalog-content\") pod \"certified-operators-b9ndg\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " pod="openshift-marketplace/certified-operators-b9ndg"
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.178117 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-catalog-content\") pod \"certified-operators-b9ndg\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " pod="openshift-marketplace/certified-operators-b9ndg"
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.178234 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-utilities\") pod \"certified-operators-b9ndg\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " pod="openshift-marketplace/certified-operators-b9ndg"
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.203950 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26z57\" (UniqueName: \"kubernetes.io/projected/3863b38a-9a0c-4fa3-909d-3b082b55197f-kube-api-access-26z57\") pod \"certified-operators-b9ndg\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " pod="openshift-marketplace/certified-operators-b9ndg"
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.287120 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b9ndg"
Oct 02 09:45:43 crc kubenswrapper[4722]: I1002 09:45:43.848108 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b9ndg"]
Oct 02 09:45:43 crc kubenswrapper[4722]: W1002 09:45:43.856975 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3863b38a_9a0c_4fa3_909d_3b082b55197f.slice/crio-d687cdad1badd15824de76e615711e6f93ee63a21bcde85af726dbd49f7e3cac WatchSource:0}: Error finding container d687cdad1badd15824de76e615711e6f93ee63a21bcde85af726dbd49f7e3cac: Status 404 returned error can't find the container with id d687cdad1badd15824de76e615711e6f93ee63a21bcde85af726dbd49f7e3cac
Oct 02 09:45:44 crc kubenswrapper[4722]: I1002 09:45:44.569533 4722 generic.go:334] "Generic (PLEG): container finished" podID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerID="673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898" exitCode=0
Oct 02 09:45:44 crc kubenswrapper[4722]: I1002 09:45:44.569592 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b9ndg" event={"ID":"3863b38a-9a0c-4fa3-909d-3b082b55197f","Type":"ContainerDied","Data":"673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898"}
Oct 02 09:45:44 crc kubenswrapper[4722]: I1002 09:45:44.569625 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b9ndg" event={"ID":"3863b38a-9a0c-4fa3-909d-3b082b55197f","Type":"ContainerStarted","Data":"d687cdad1badd15824de76e615711e6f93ee63a21bcde85af726dbd49f7e3cac"}
Oct 02 09:45:46 crc kubenswrapper[4722]: I1002 09:45:46.584673 4722 generic.go:334] "Generic (PLEG): container finished" podID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerID="c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061" exitCode=0
Oct 02 09:45:46 crc kubenswrapper[4722]: I1002 09:45:46.584801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b9ndg" event={"ID":"3863b38a-9a0c-4fa3-909d-3b082b55197f","Type":"ContainerDied","Data":"c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061"}
Oct 02 09:45:46 crc kubenswrapper[4722]: I1002 09:45:46.796350 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-546df5c677-k4c6f"
Oct 02 09:45:47 crc kubenswrapper[4722]: I1002 09:45:47.603278 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b9ndg" event={"ID":"3863b38a-9a0c-4fa3-909d-3b082b55197f","Type":"ContainerStarted","Data":"85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce"}
Oct 02 09:45:47 crc kubenswrapper[4722]: I1002 09:45:47.620165 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b9ndg" podStartSLOduration=3.129577696 podStartE2EDuration="5.620136771s" podCreationTimestamp="2025-10-02 09:45:42 +0000 UTC" firstStartedPulling="2025-10-02 09:45:44.570950275 +0000 UTC m=+800.607703255" lastFinishedPulling="2025-10-02 09:45:47.06150935 +0000 UTC m=+803.098262330" observedRunningTime="2025-10-02 09:45:47.618873759 +0000 UTC m=+803.655626749" watchObservedRunningTime="2025-10-02 09:45:47.620136771 +0000 UTC m=+803.656889751"
firstStartedPulling="2025-10-02 09:45:44.570950275 +0000 UTC m=+800.607703255" lastFinishedPulling="2025-10-02 09:45:47.06150935 +0000 UTC m=+803.098262330" observedRunningTime="2025-10-02 09:45:47.618873759 +0000 UTC m=+803.655626749" watchObservedRunningTime="2025-10-02 09:45:47.620136771 +0000 UTC m=+803.656889751" Oct 02 09:45:53 crc kubenswrapper[4722]: I1002 09:45:53.287713 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b9ndg" Oct 02 09:45:53 crc kubenswrapper[4722]: I1002 09:45:53.288142 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b9ndg" Oct 02 09:45:53 crc kubenswrapper[4722]: I1002 09:45:53.337072 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b9ndg" Oct 02 09:45:53 crc kubenswrapper[4722]: I1002 09:45:53.677785 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b9ndg" Oct 02 09:45:53 crc kubenswrapper[4722]: I1002 09:45:53.716864 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b9ndg"] Oct 02 09:45:55 crc kubenswrapper[4722]: I1002 09:45:55.650902 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b9ndg" podUID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerName="registry-server" containerID="cri-o://85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce" gracePeriod=2 Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.047630 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b9ndg" Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.150804 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-utilities\") pod \"3863b38a-9a0c-4fa3-909d-3b082b55197f\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.150859 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-catalog-content\") pod \"3863b38a-9a0c-4fa3-909d-3b082b55197f\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.150924 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26z57\" (UniqueName: \"kubernetes.io/projected/3863b38a-9a0c-4fa3-909d-3b082b55197f-kube-api-access-26z57\") pod \"3863b38a-9a0c-4fa3-909d-3b082b55197f\" (UID: \"3863b38a-9a0c-4fa3-909d-3b082b55197f\") " Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.152241 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-utilities" (OuterVolumeSpecName: "utilities") pod "3863b38a-9a0c-4fa3-909d-3b082b55197f" (UID: "3863b38a-9a0c-4fa3-909d-3b082b55197f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.157658 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3863b38a-9a0c-4fa3-909d-3b082b55197f-kube-api-access-26z57" (OuterVolumeSpecName: "kube-api-access-26z57") pod "3863b38a-9a0c-4fa3-909d-3b082b55197f" (UID: "3863b38a-9a0c-4fa3-909d-3b082b55197f"). InnerVolumeSpecName "kube-api-access-26z57". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.205259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3863b38a-9a0c-4fa3-909d-3b082b55197f" (UID: "3863b38a-9a0c-4fa3-909d-3b082b55197f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.252882 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.252933 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3863b38a-9a0c-4fa3-909d-3b082b55197f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.252950 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26z57\" (UniqueName: \"kubernetes.io/projected/3863b38a-9a0c-4fa3-909d-3b082b55197f-kube-api-access-26z57\") on node \"crc\" DevicePath \"\"" Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.659795 4722 generic.go:334] "Generic (PLEG): container finished" podID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerID="85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce" exitCode=0 Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.659867 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b9ndg" event={"ID":"3863b38a-9a0c-4fa3-909d-3b082b55197f","Type":"ContainerDied","Data":"85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce"} Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.659912 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b9ndg" event={"ID":"3863b38a-9a0c-4fa3-909d-3b082b55197f","Type":"ContainerDied","Data":"d687cdad1badd15824de76e615711e6f93ee63a21bcde85af726dbd49f7e3cac"} Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.659940 4722 scope.go:117] "RemoveContainer" containerID="85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce" Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.660194 4722 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.678277 4722 scope.go:117] "RemoveContainer" containerID="c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061"
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.694153 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b9ndg"]
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.705360 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b9ndg"]
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.716106 4722 scope.go:117] "RemoveContainer" containerID="673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898"
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.718124 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3863b38a-9a0c-4fa3-909d-3b082b55197f" path="/var/lib/kubelet/pods/3863b38a-9a0c-4fa3-909d-3b082b55197f/volumes"
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.732070 4722 scope.go:117] "RemoveContainer" containerID="85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce"
Oct 02 09:45:56 crc kubenswrapper[4722]: E1002 09:45:56.732636 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce\": container with ID starting with 85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce not found: ID does not exist" containerID="85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce"
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.732756 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce"} err="failed to get container status \"85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce\": rpc error: code = NotFound desc = could not find container \"85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce\": container with ID starting with 85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce not found: ID does not exist"
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.732852 4722 scope.go:117] "RemoveContainer" containerID="c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061"
Oct 02 09:45:56 crc kubenswrapper[4722]: E1002 09:45:56.734013 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061\": container with ID starting with c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061 not found: ID does not exist" containerID="c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061"
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.734072 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061"} err="failed to get container status \"c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061\": rpc error: code = NotFound desc = could not find container \"c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061\": container with ID starting with c41137e15f0ef3055571ed3b34043599c5a1f26bf4cdc04f047d8a1801da5061 not found: ID does not exist"
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.734111 4722 scope.go:117] "RemoveContainer" containerID="673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898"
Oct 02 09:45:56 crc kubenswrapper[4722]: E1002 09:45:56.734515 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898\": container with ID starting with 673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898 not found: ID does not exist" containerID="673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898"
Oct 02 09:45:56 crc kubenswrapper[4722]: I1002 09:45:56.734553 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898"} err="failed to get container status \"673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898\": rpc error: code = NotFound desc = could not find container \"673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898\": container with ID starting with 673fc3eb67cb47462eef84e6215d50964312e96071b21508f8ad801fb1161898 not found: ID does not exist"
Oct 02 09:46:06 crc kubenswrapper[4722]: I1002 09:46:06.501777 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5559d9646f-85qkg"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.213180 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pl8wr"]
Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.213484 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerName="extract-content"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.213500 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerName="extract-content"
Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.213509 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerName="registry-server"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.213515 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerName="registry-server"
Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.213529 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerName="extract-utilities"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.213536 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerName="extract-utilities"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.213646 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3863b38a-9a0c-4fa3-909d-3b082b55197f" containerName="registry-server"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.215468 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pl8wr"
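The RemoveContainer/ContainerStatus exchange above shows the kubelet asking the CRI runtime for the status of containers it has just deleted; the runtime answers with gRPC NotFound, which the kubelet logs ("DeleteContainer returned error") and then treats as already gone. A sketch of that status-code check, assuming the google.golang.org/grpc module is available; the containerStatus stub stands in for the real CRI call:

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// containerStatus stands in for the CRI ContainerStatus RPC; here it always
// reports the container as missing, like the runtime responses above.
func containerStatus(id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

func main() {
	id := "85177167b48bf348050b94800fe72be2cbd3c8fc75b0f9b6d0e7266f9debc4ce"
	if err := containerStatus(id); status.Code(err) == codes.NotFound {
		// NotFound after a successful delete means there is nothing left to
		// remove, so the caller logs it and moves on instead of retrying.
		fmt.Println("container already removed:", err)
	}
}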
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.219138 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.219275 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.219629 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vphtv"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.219796 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7"]
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.220996 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.223944 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.232741 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7"]
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.300921 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b8e73d-a3a0-4f67-9634-8e7f9f6038e8-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zj8n7\" (UID: \"17b8e73d-a3a0-4f67-9634-8e7f9f6038e8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.300972 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl29f\" (UniqueName: \"kubernetes.io/projected/c20d6613-a91d-4afc-96a1-efbdffba4415-kube-api-access-bl29f\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.300991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-reloader\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.301010 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpw4d\" (UniqueName: \"kubernetes.io/projected/17b8e73d-a3a0-4f67-9634-8e7f9f6038e8-kube-api-access-bpw4d\") pod \"frr-k8s-webhook-server-64bf5d555-zj8n7\" (UID: \"17b8e73d-a3a0-4f67-9634-8e7f9f6038e8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.301035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-frr-sockets\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.301155 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-metrics\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.301199 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-frr-conf\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.301216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c20d6613-a91d-4afc-96a1-efbdffba4415-frr-startup\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.301408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c20d6613-a91d-4afc-96a1-efbdffba4415-metrics-certs\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.306919 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-clhg2"]
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.308097 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-clhg2"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.310287 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.310333 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-qsmhv"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.310347 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.310465 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.320675 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-62lfz"]
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.321590 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-62lfz"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.323430 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.340660 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-62lfz"]
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403180 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b8fc405-8054-44d8-8213-089c88a2c0a4-metrics-certs\") pod \"controller-68d546b9d8-62lfz\" (UID: \"0b8fc405-8054-44d8-8213-089c88a2c0a4\") " pod="metallb-system/controller-68d546b9d8-62lfz"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l7sn\" (UniqueName: \"kubernetes.io/projected/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-kube-api-access-5l7sn\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403572 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-metallb-excludel2\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c20d6613-a91d-4afc-96a1-efbdffba4415-metrics-certs\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403630 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnvf7\" (UniqueName: \"kubernetes.io/projected/0b8fc405-8054-44d8-8213-089c88a2c0a4-kube-api-access-jnvf7\") pod \"controller-68d546b9d8-62lfz\" (UID: \"0b8fc405-8054-44d8-8213-089c88a2c0a4\") " pod="metallb-system/controller-68d546b9d8-62lfz"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403654 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-memberlist\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b8e73d-a3a0-4f67-9634-8e7f9f6038e8-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zj8n7\" (UID: \"17b8e73d-a3a0-4f67-9634-8e7f9f6038e8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403736 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-metrics-certs\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403755 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl29f\" (UniqueName: \"kubernetes.io/projected/c20d6613-a91d-4afc-96a1-efbdffba4415-kube-api-access-bl29f\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403771 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-reloader\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403789 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpw4d\" (UniqueName: \"kubernetes.io/projected/17b8e73d-a3a0-4f67-9634-8e7f9f6038e8-kube-api-access-bpw4d\") pod \"frr-k8s-webhook-server-64bf5d555-zj8n7\" (UID: \"17b8e73d-a3a0-4f67-9634-8e7f9f6038e8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7"
Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.403803 4722 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.403824 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-frr-sockets\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.403803 4722 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.403995 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c20d6613-a91d-4afc-96a1-efbdffba4415-metrics-certs podName:c20d6613-a91d-4afc-96a1-efbdffba4415 nodeName:}" failed. No retries permitted until 2025-10-02 09:46:07.903869613 +0000 UTC m=+823.940622793 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c20d6613-a91d-4afc-96a1-efbdffba4415-metrics-certs") pod "frr-k8s-pl8wr" (UID: "c20d6613-a91d-4afc-96a1-efbdffba4415") : secret "frr-k8s-certs-secret" not found
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.404083 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b8fc405-8054-44d8-8213-089c88a2c0a4-cert\") pod \"controller-68d546b9d8-62lfz\" (UID: \"0b8fc405-8054-44d8-8213-089c88a2c0a4\") " pod="metallb-system/controller-68d546b9d8-62lfz"
Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.404148 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b8e73d-a3a0-4f67-9634-8e7f9f6038e8-cert podName:17b8e73d-a3a0-4f67-9634-8e7f9f6038e8 nodeName:}" failed. No retries permitted until 2025-10-02 09:46:07.9041308 +0000 UTC m=+823.940883780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17b8e73d-a3a0-4f67-9634-8e7f9f6038e8-cert") pod "frr-k8s-webhook-server-64bf5d555-zj8n7" (UID: "17b8e73d-a3a0-4f67-9634-8e7f9f6038e8") : secret "frr-k8s-webhook-server-cert" not found
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.404189 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-metrics\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.404214 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-frr-conf\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.404245 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c20d6613-a91d-4afc-96a1-efbdffba4415-frr-startup\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.404328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-reloader\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.404475 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-frr-sockets\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.404570 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-frr-conf\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.408599 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c20d6613-a91d-4afc-96a1-efbdffba4415-metrics\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.409295 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c20d6613-a91d-4afc-96a1-efbdffba4415-frr-startup\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.426095 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl29f\" (UniqueName: \"kubernetes.io/projected/c20d6613-a91d-4afc-96a1-efbdffba4415-kube-api-access-bl29f\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr"
Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.426198 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpw4d\" (UniqueName: \"kubernetes.io/projected/17b8e73d-a3a0-4f67-9634-8e7f9f6038e8-kube-api-access-bpw4d\") pod \"frr-k8s-webhook-server-64bf5d555-zj8n7\" (UID: \"17b8e73d-a3a0-4f67-9634-8e7f9f6038e8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7"
for volume \"kube-api-access-bpw4d\" (UniqueName: \"kubernetes.io/projected/17b8e73d-a3a0-4f67-9634-8e7f9f6038e8-kube-api-access-bpw4d\") pod \"frr-k8s-webhook-server-64bf5d555-zj8n7\" (UID: \"17b8e73d-a3a0-4f67-9634-8e7f9f6038e8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.505293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b8fc405-8054-44d8-8213-089c88a2c0a4-metrics-certs\") pod \"controller-68d546b9d8-62lfz\" (UID: \"0b8fc405-8054-44d8-8213-089c88a2c0a4\") " pod="metallb-system/controller-68d546b9d8-62lfz" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.505356 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l7sn\" (UniqueName: \"kubernetes.io/projected/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-kube-api-access-5l7sn\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.505381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-metallb-excludel2\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.505437 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnvf7\" (UniqueName: \"kubernetes.io/projected/0b8fc405-8054-44d8-8213-089c88a2c0a4-kube-api-access-jnvf7\") pod \"controller-68d546b9d8-62lfz\" (UID: \"0b8fc405-8054-44d8-8213-089c88a2c0a4\") " pod="metallb-system/controller-68d546b9d8-62lfz" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.505466 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-memberlist\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.505517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-metrics-certs\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.505543 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b8fc405-8054-44d8-8213-089c88a2c0a4-cert\") pod \"controller-68d546b9d8-62lfz\" (UID: \"0b8fc405-8054-44d8-8213-089c88a2c0a4\") " pod="metallb-system/controller-68d546b9d8-62lfz" Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.506639 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.506665 4722 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.506715 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-memberlist podName:97a5fee1-9fe2-4292-82d0-a743eff0ecbf nodeName:}" failed. 
No retries permitted until 2025-10-02 09:46:08.006695275 +0000 UTC m=+824.043448255 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-memberlist") pod "speaker-clhg2" (UID: "97a5fee1-9fe2-4292-82d0-a743eff0ecbf") : secret "metallb-memberlist" not found Oct 02 09:46:07 crc kubenswrapper[4722]: E1002 09:46:07.506753 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-metrics-certs podName:97a5fee1-9fe2-4292-82d0-a743eff0ecbf nodeName:}" failed. No retries permitted until 2025-10-02 09:46:08.006722906 +0000 UTC m=+824.043475886 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-metrics-certs") pod "speaker-clhg2" (UID: "97a5fee1-9fe2-4292-82d0-a743eff0ecbf") : secret "speaker-certs-secret" not found Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.507253 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-metallb-excludel2\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.511325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b8fc405-8054-44d8-8213-089c88a2c0a4-metrics-certs\") pod \"controller-68d546b9d8-62lfz\" (UID: \"0b8fc405-8054-44d8-8213-089c88a2c0a4\") " pod="metallb-system/controller-68d546b9d8-62lfz" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.513103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b8fc405-8054-44d8-8213-089c88a2c0a4-cert\") pod \"controller-68d546b9d8-62lfz\" (UID: \"0b8fc405-8054-44d8-8213-089c88a2c0a4\") " pod="metallb-system/controller-68d546b9d8-62lfz" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.534672 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l7sn\" (UniqueName: \"kubernetes.io/projected/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-kube-api-access-5l7sn\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.535419 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnvf7\" (UniqueName: \"kubernetes.io/projected/0b8fc405-8054-44d8-8213-089c88a2c0a4-kube-api-access-jnvf7\") pod \"controller-68d546b9d8-62lfz\" (UID: \"0b8fc405-8054-44d8-8213-089c88a2c0a4\") " pod="metallb-system/controller-68d546b9d8-62lfz" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.636469 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-62lfz" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.910900 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c20d6613-a91d-4afc-96a1-efbdffba4415-metrics-certs\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.911291 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b8e73d-a3a0-4f67-9634-8e7f9f6038e8-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zj8n7\" (UID: \"17b8e73d-a3a0-4f67-9634-8e7f9f6038e8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.914405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c20d6613-a91d-4afc-96a1-efbdffba4415-metrics-certs\") pod \"frr-k8s-pl8wr\" (UID: \"c20d6613-a91d-4afc-96a1-efbdffba4415\") " pod="metallb-system/frr-k8s-pl8wr" Oct 02 09:46:07 crc kubenswrapper[4722]: I1002 09:46:07.915048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b8e73d-a3a0-4f67-9634-8e7f9f6038e8-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zj8n7\" (UID: \"17b8e73d-a3a0-4f67-9634-8e7f9f6038e8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7" Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.013069 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-metrics-certs\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.013185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-memberlist\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:08 crc kubenswrapper[4722]: E1002 09:46:08.013364 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 09:46:08 crc kubenswrapper[4722]: E1002 09:46:08.013419 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-memberlist podName:97a5fee1-9fe2-4292-82d0-a743eff0ecbf nodeName:}" failed. No retries permitted until 2025-10-02 09:46:09.013402673 +0000 UTC m=+825.050155653 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-memberlist") pod "speaker-clhg2" (UID: "97a5fee1-9fe2-4292-82d0-a743eff0ecbf") : secret "metallb-memberlist" not found Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.017416 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-metrics-certs\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.033397 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-62lfz"] Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.136044 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pl8wr" Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.150300 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7" Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.560840 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7"] Oct 02 09:46:08 crc kubenswrapper[4722]: W1002 09:46:08.567753 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17b8e73d_a3a0_4f67_9634_8e7f9f6038e8.slice/crio-4ff91216f684e2df1b401b055dda9a9ac608e031950f6eb4fce9413127a2f62a WatchSource:0}: Error finding container 4ff91216f684e2df1b401b055dda9a9ac608e031950f6eb4fce9413127a2f62a: Status 404 returned error can't find the container with id 4ff91216f684e2df1b401b055dda9a9ac608e031950f6eb4fce9413127a2f62a Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.737486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pl8wr" event={"ID":"c20d6613-a91d-4afc-96a1-efbdffba4415","Type":"ContainerStarted","Data":"101365f2b58e31ddbbdac2a4827d80efc6c57c3d22ac5bd9fe584f3f8a51228d"} Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.738348 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7" event={"ID":"17b8e73d-a3a0-4f67-9634-8e7f9f6038e8","Type":"ContainerStarted","Data":"4ff91216f684e2df1b401b055dda9a9ac608e031950f6eb4fce9413127a2f62a"} Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.739891 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-62lfz" event={"ID":"0b8fc405-8054-44d8-8213-089c88a2c0a4","Type":"ContainerStarted","Data":"f36aec9285d780a605d71b7d8cab3e99ab5f4a2199bbf7dc13e11ed3d1d4d196"} Oct 02 09:46:08 crc kubenswrapper[4722]: I1002 09:46:08.739921 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-62lfz" event={"ID":"0b8fc405-8054-44d8-8213-089c88a2c0a4","Type":"ContainerStarted","Data":"b26ee53dbc13fc437bfc7665d947755201d1310dd006d4156f5c84a449e48460"} Oct 02 09:46:09 crc kubenswrapper[4722]: I1002 09:46:09.026523 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-memberlist\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:09 crc kubenswrapper[4722]: I1002 09:46:09.033891 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/97a5fee1-9fe2-4292-82d0-a743eff0ecbf-memberlist\") pod \"speaker-clhg2\" (UID: \"97a5fee1-9fe2-4292-82d0-a743eff0ecbf\") " pod="metallb-system/speaker-clhg2" Oct 02 09:46:09 crc kubenswrapper[4722]: I1002 09:46:09.123421 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-clhg2" Oct 02 09:46:09 crc kubenswrapper[4722]: W1002 09:46:09.151384 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a5fee1_9fe2_4292_82d0_a743eff0ecbf.slice/crio-9176f790eaf8e7a2241d1a691966c2e2bf04c90f4bc259c3dc420915ffde2a9f WatchSource:0}: Error finding container 9176f790eaf8e7a2241d1a691966c2e2bf04c90f4bc259c3dc420915ffde2a9f: Status 404 returned error can't find the container with id 9176f790eaf8e7a2241d1a691966c2e2bf04c90f4bc259c3dc420915ffde2a9f Oct 02 09:46:09 crc kubenswrapper[4722]: I1002 09:46:09.756008 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-clhg2" event={"ID":"97a5fee1-9fe2-4292-82d0-a743eff0ecbf","Type":"ContainerStarted","Data":"895882acfff63fe3d4bae4e1cca97c69c5adc9347fb7ce5b39a87bffd324e147"} Oct 02 09:46:09 crc kubenswrapper[4722]: I1002 09:46:09.756366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-clhg2" event={"ID":"97a5fee1-9fe2-4292-82d0-a743eff0ecbf","Type":"ContainerStarted","Data":"9176f790eaf8e7a2241d1a691966c2e2bf04c90f4bc259c3dc420915ffde2a9f"} Oct 02 09:46:10 crc kubenswrapper[4722]: I1002 09:46:10.696683 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:46:10 crc kubenswrapper[4722]: I1002 09:46:10.696742 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:46:10 crc kubenswrapper[4722]: I1002 09:46:10.696798 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:46:10 crc kubenswrapper[4722]: I1002 09:46:10.697480 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e23a92796174868d03a49e3fa647f65a4cb15be5ec415c3eed20693726accb6"} pod="openshift-machine-config-operator/machine-config-daemon-q6h76" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 09:46:10 crc kubenswrapper[4722]: I1002 09:46:10.697547 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" containerID="cri-o://2e23a92796174868d03a49e3fa647f65a4cb15be5ec415c3eed20693726accb6" gracePeriod=600 Oct 02 09:46:11 crc kubenswrapper[4722]: I1002 09:46:11.776649 4722 generic.go:334] "Generic (PLEG): container finished" podID="f662826c-e772-4f56-a85f-83971aaef356" 
containerID="2e23a92796174868d03a49e3fa647f65a4cb15be5ec415c3eed20693726accb6" exitCode=0 Oct 02 09:46:11 crc kubenswrapper[4722]: I1002 09:46:11.776956 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerDied","Data":"2e23a92796174868d03a49e3fa647f65a4cb15be5ec415c3eed20693726accb6"} Oct 02 09:46:11 crc kubenswrapper[4722]: I1002 09:46:11.776995 4722 scope.go:117] "RemoveContainer" containerID="2ed22cde6265a465e12b7d2f595420982dbcb915a0dbd020faa1db13915d4fbc" Oct 02 09:46:12 crc kubenswrapper[4722]: I1002 09:46:12.783986 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-clhg2" event={"ID":"97a5fee1-9fe2-4292-82d0-a743eff0ecbf","Type":"ContainerStarted","Data":"a5f7a26329058bb6611fd57e296ffde379528f313c27e44327dde295f4b97bb0"} Oct 02 09:46:12 crc kubenswrapper[4722]: I1002 09:46:12.785887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-62lfz" event={"ID":"0b8fc405-8054-44d8-8213-089c88a2c0a4","Type":"ContainerStarted","Data":"d0624e064ea47d75262e0c326e26ef3bb875d9e5168e81659043e8da5c0928fe"} Oct 02 09:46:12 crc kubenswrapper[4722]: I1002 09:46:12.786023 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-62lfz" Oct 02 09:46:12 crc kubenswrapper[4722]: I1002 09:46:12.788469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerStarted","Data":"424c2680b223e9db678ffaf1f8b8a1f4cab14a156d3b1044d8be0e10fbad104c"} Oct 02 09:46:12 crc kubenswrapper[4722]: I1002 09:46:12.811521 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-clhg2" podStartSLOduration=3.577881439 podStartE2EDuration="5.811501011s" podCreationTimestamp="2025-10-02 09:46:07 +0000 UTC" firstStartedPulling="2025-10-02 09:46:09.471012226 +0000 UTC m=+825.507765206" lastFinishedPulling="2025-10-02 09:46:11.704631798 +0000 UTC m=+827.741384778" observedRunningTime="2025-10-02 09:46:12.809243044 +0000 UTC m=+828.845996034" watchObservedRunningTime="2025-10-02 09:46:12.811501011 +0000 UTC m=+828.848253991" Oct 02 09:46:12 crc kubenswrapper[4722]: I1002 09:46:12.847438 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-62lfz" podStartSLOduration=2.317234254 podStartE2EDuration="5.847389445s" podCreationTimestamp="2025-10-02 09:46:07 +0000 UTC" firstStartedPulling="2025-10-02 09:46:08.196428504 +0000 UTC m=+824.233181484" lastFinishedPulling="2025-10-02 09:46:11.726583695 +0000 UTC m=+827.763336675" observedRunningTime="2025-10-02 09:46:12.826889034 +0000 UTC m=+828.863642034" watchObservedRunningTime="2025-10-02 09:46:12.847389445 +0000 UTC m=+828.884142445" Oct 02 09:46:13 crc kubenswrapper[4722]: I1002 09:46:13.793212 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-clhg2" Oct 02 09:46:16 crc kubenswrapper[4722]: I1002 09:46:16.809514 4722 generic.go:334] "Generic (PLEG): container finished" podID="c20d6613-a91d-4afc-96a1-efbdffba4415" containerID="ed3316bbd5823ff919b585ced9c2d4694eeaad3f491aca22c0cae74787261a3d" exitCode=0 Oct 02 09:46:16 crc kubenswrapper[4722]: I1002 09:46:16.809561 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-pl8wr" event={"ID":"c20d6613-a91d-4afc-96a1-efbdffba4415","Type":"ContainerDied","Data":"ed3316bbd5823ff919b585ced9c2d4694eeaad3f491aca22c0cae74787261a3d"} Oct 02 09:46:16 crc kubenswrapper[4722]: I1002 09:46:16.812056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7" event={"ID":"17b8e73d-a3a0-4f67-9634-8e7f9f6038e8","Type":"ContainerStarted","Data":"eab6de34226fd6069be7cdf95f3e01168a56674d0e36bdaa55b561f6993dff6d"} Oct 02 09:46:16 crc kubenswrapper[4722]: I1002 09:46:16.812206 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7" Oct 02 09:46:16 crc kubenswrapper[4722]: I1002 09:46:16.865507 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7" podStartSLOduration=2.788715363 podStartE2EDuration="9.865487126s" podCreationTimestamp="2025-10-02 09:46:07 +0000 UTC" firstStartedPulling="2025-10-02 09:46:08.571195202 +0000 UTC m=+824.607948182" lastFinishedPulling="2025-10-02 09:46:15.647966965 +0000 UTC m=+831.684719945" observedRunningTime="2025-10-02 09:46:16.862540763 +0000 UTC m=+832.899293763" watchObservedRunningTime="2025-10-02 09:46:16.865487126 +0000 UTC m=+832.902240106" Oct 02 09:46:17 crc kubenswrapper[4722]: I1002 09:46:17.821368 4722 generic.go:334] "Generic (PLEG): container finished" podID="c20d6613-a91d-4afc-96a1-efbdffba4415" containerID="cdc028e73a4f17f442d621820982b8c98318acb08aa8b0b98226e533f3d080bf" exitCode=0 Oct 02 09:46:17 crc kubenswrapper[4722]: I1002 09:46:17.821477 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pl8wr" event={"ID":"c20d6613-a91d-4afc-96a1-efbdffba4415","Type":"ContainerDied","Data":"cdc028e73a4f17f442d621820982b8c98318acb08aa8b0b98226e533f3d080bf"} Oct 02 09:46:18 crc kubenswrapper[4722]: I1002 09:46:18.828729 4722 generic.go:334] "Generic (PLEG): container finished" podID="c20d6613-a91d-4afc-96a1-efbdffba4415" containerID="06261843ddc1a6fac9816d5e25e8360837fa17599d186d4c78fc95ee659a11d9" exitCode=0 Oct 02 09:46:18 crc kubenswrapper[4722]: I1002 09:46:18.828778 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pl8wr" event={"ID":"c20d6613-a91d-4afc-96a1-efbdffba4415","Type":"ContainerDied","Data":"06261843ddc1a6fac9816d5e25e8360837fa17599d186d4c78fc95ee659a11d9"} Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.128014 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-clhg2" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.241421 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8bchh"] Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.242902 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.262635 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8bchh"] Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.374288 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-catalog-content\") pod \"community-operators-8bchh\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.374441 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-utilities\") pod \"community-operators-8bchh\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.374484 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gxwq\" (UniqueName: \"kubernetes.io/projected/6e5d7413-99c0-47f6-9d88-79798e9f6225-kube-api-access-2gxwq\") pod \"community-operators-8bchh\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.475732 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-utilities\") pod \"community-operators-8bchh\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.475804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gxwq\" (UniqueName: \"kubernetes.io/projected/6e5d7413-99c0-47f6-9d88-79798e9f6225-kube-api-access-2gxwq\") pod \"community-operators-8bchh\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.475838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-catalog-content\") pod \"community-operators-8bchh\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.476300 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-utilities\") pod \"community-operators-8bchh\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.476552 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-catalog-content\") pod \"community-operators-8bchh\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.498741 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2gxwq\" (UniqueName: \"kubernetes.io/projected/6e5d7413-99c0-47f6-9d88-79798e9f6225-kube-api-access-2gxwq\") pod \"community-operators-8bchh\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.561390 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.866935 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8bchh"] Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.876965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pl8wr" event={"ID":"c20d6613-a91d-4afc-96a1-efbdffba4415","Type":"ContainerStarted","Data":"f665dc2972d35dcf22f5163ba4c9f9ce7c4380249ec686f436ef4be042b014b9"} Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.877016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pl8wr" event={"ID":"c20d6613-a91d-4afc-96a1-efbdffba4415","Type":"ContainerStarted","Data":"92aac999a11f3faecbbc936caf02fbb82674d1d42aac72031f11a0989e3c0f2b"} Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.877038 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pl8wr" event={"ID":"c20d6613-a91d-4afc-96a1-efbdffba4415","Type":"ContainerStarted","Data":"de7d47041734831d7cfb1f90a7ba3775ddde18418e2ba5a9659337fc5d4660fa"} Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.877052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pl8wr" event={"ID":"c20d6613-a91d-4afc-96a1-efbdffba4415","Type":"ContainerStarted","Data":"498ac92966bef73f34a31e35e7e0ac352d910e9df26d078d9bef5e373842add4"} Oct 02 09:46:19 crc kubenswrapper[4722]: I1002 09:46:19.877065 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pl8wr" event={"ID":"c20d6613-a91d-4afc-96a1-efbdffba4415","Type":"ContainerStarted","Data":"f3b807f850658efb56ff357259bd33d659f6878a3a17b919c89c4352628dc1ee"} Oct 02 09:46:20 crc kubenswrapper[4722]: I1002 09:46:20.894634 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pl8wr" event={"ID":"c20d6613-a91d-4afc-96a1-efbdffba4415","Type":"ContainerStarted","Data":"48234b3942e45744849c9c7482b75a7e9a43adf667c778509f3ca956c68a247d"} Oct 02 09:46:20 crc kubenswrapper[4722]: I1002 09:46:20.896382 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pl8wr" Oct 02 09:46:20 crc kubenswrapper[4722]: I1002 09:46:20.899350 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e5d7413-99c0-47f6-9d88-79798e9f6225" containerID="938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15" exitCode=0 Oct 02 09:46:20 crc kubenswrapper[4722]: I1002 09:46:20.899397 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bchh" event={"ID":"6e5d7413-99c0-47f6-9d88-79798e9f6225","Type":"ContainerDied","Data":"938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15"} Oct 02 09:46:20 crc kubenswrapper[4722]: I1002 09:46:20.899457 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bchh" 
event={"ID":"6e5d7413-99c0-47f6-9d88-79798e9f6225","Type":"ContainerStarted","Data":"7e76fa916af61b22798f528c31be9cfd05e40aee34c442b9f9a0ec464e76cfb0"} Oct 02 09:46:20 crc kubenswrapper[4722]: I1002 09:46:20.922879 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pl8wr" podStartSLOduration=6.501997339 podStartE2EDuration="13.922857186s" podCreationTimestamp="2025-10-02 09:46:07 +0000 UTC" firstStartedPulling="2025-10-02 09:46:08.244986444 +0000 UTC m=+824.281739424" lastFinishedPulling="2025-10-02 09:46:15.665846291 +0000 UTC m=+831.702599271" observedRunningTime="2025-10-02 09:46:20.921895552 +0000 UTC m=+836.958648542" watchObservedRunningTime="2025-10-02 09:46:20.922857186 +0000 UTC m=+836.959610176" Oct 02 09:46:21 crc kubenswrapper[4722]: I1002 09:46:21.908563 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bchh" event={"ID":"6e5d7413-99c0-47f6-9d88-79798e9f6225","Type":"ContainerStarted","Data":"84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e"} Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.027050 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-szbtd"] Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.028459 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.040041 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szbtd"] Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.221574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-catalog-content\") pod \"redhat-marketplace-szbtd\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.221725 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkhn5\" (UniqueName: \"kubernetes.io/projected/f36fadf5-d108-4d4b-8985-d9450d8a4368-kube-api-access-wkhn5\") pod \"redhat-marketplace-szbtd\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.221776 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-utilities\") pod \"redhat-marketplace-szbtd\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.322605 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-catalog-content\") pod \"redhat-marketplace-szbtd\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.322677 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkhn5\" (UniqueName: \"kubernetes.io/projected/f36fadf5-d108-4d4b-8985-d9450d8a4368-kube-api-access-wkhn5\") pod 
\"redhat-marketplace-szbtd\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.322722 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-utilities\") pod \"redhat-marketplace-szbtd\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.323406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-utilities\") pod \"redhat-marketplace-szbtd\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.323407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-catalog-content\") pod \"redhat-marketplace-szbtd\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.343591 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkhn5\" (UniqueName: \"kubernetes.io/projected/f36fadf5-d108-4d4b-8985-d9450d8a4368-kube-api-access-wkhn5\") pod \"redhat-marketplace-szbtd\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.353955 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.825253 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szbtd"] Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.919476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szbtd" event={"ID":"f36fadf5-d108-4d4b-8985-d9450d8a4368","Type":"ContainerStarted","Data":"c1daa99b8019fcb7d0c7664c22b17f2e47c8cf1e47c025ab3592c5ebb40e6a2a"} Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.922375 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e5d7413-99c0-47f6-9d88-79798e9f6225" containerID="84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e" exitCode=0 Oct 02 09:46:22 crc kubenswrapper[4722]: I1002 09:46:22.922481 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bchh" event={"ID":"6e5d7413-99c0-47f6-9d88-79798e9f6225","Type":"ContainerDied","Data":"84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e"} Oct 02 09:46:23 crc kubenswrapper[4722]: I1002 09:46:23.137331 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pl8wr" Oct 02 09:46:23 crc kubenswrapper[4722]: I1002 09:46:23.213993 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pl8wr" Oct 02 09:46:23 crc kubenswrapper[4722]: I1002 09:46:23.932893 4722 generic.go:334] "Generic (PLEG): container finished" podID="f36fadf5-d108-4d4b-8985-d9450d8a4368" containerID="fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5" exitCode=0 Oct 02 09:46:23 crc kubenswrapper[4722]: I1002 09:46:23.933022 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szbtd" event={"ID":"f36fadf5-d108-4d4b-8985-d9450d8a4368","Type":"ContainerDied","Data":"fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5"} Oct 02 09:46:24 crc kubenswrapper[4722]: I1002 09:46:24.944694 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bchh" event={"ID":"6e5d7413-99c0-47f6-9d88-79798e9f6225","Type":"ContainerStarted","Data":"1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5"} Oct 02 09:46:24 crc kubenswrapper[4722]: I1002 09:46:24.969026 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8bchh" podStartSLOduration=2.864894222 podStartE2EDuration="5.969002577s" podCreationTimestamp="2025-10-02 09:46:19 +0000 UTC" firstStartedPulling="2025-10-02 09:46:20.901926344 +0000 UTC m=+836.938679334" lastFinishedPulling="2025-10-02 09:46:24.006034709 +0000 UTC m=+840.042787689" observedRunningTime="2025-10-02 09:46:24.965749846 +0000 UTC m=+841.002502846" watchObservedRunningTime="2025-10-02 09:46:24.969002577 +0000 UTC m=+841.005755557" Oct 02 09:46:25 crc kubenswrapper[4722]: I1002 09:46:25.951693 4722 generic.go:334] "Generic (PLEG): container finished" podID="f36fadf5-d108-4d4b-8985-d9450d8a4368" containerID="f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8" exitCode=0 Oct 02 09:46:25 crc kubenswrapper[4722]: I1002 09:46:25.951919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szbtd" 
event={"ID":"f36fadf5-d108-4d4b-8985-d9450d8a4368","Type":"ContainerDied","Data":"f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8"} Oct 02 09:46:26 crc kubenswrapper[4722]: I1002 09:46:26.960718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szbtd" event={"ID":"f36fadf5-d108-4d4b-8985-d9450d8a4368","Type":"ContainerStarted","Data":"8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3"} Oct 02 09:46:26 crc kubenswrapper[4722]: I1002 09:46:26.985618 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-szbtd" podStartSLOduration=2.570496796 podStartE2EDuration="4.98560231s" podCreationTimestamp="2025-10-02 09:46:22 +0000 UTC" firstStartedPulling="2025-10-02 09:46:24.005176998 +0000 UTC m=+840.041929978" lastFinishedPulling="2025-10-02 09:46:26.420282512 +0000 UTC m=+842.457035492" observedRunningTime="2025-10-02 09:46:26.982531433 +0000 UTC m=+843.019284413" watchObservedRunningTime="2025-10-02 09:46:26.98560231 +0000 UTC m=+843.022355290" Oct 02 09:46:27 crc kubenswrapper[4722]: I1002 09:46:27.640771 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-62lfz" Oct 02 09:46:28 crc kubenswrapper[4722]: I1002 09:46:28.140494 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pl8wr" Oct 02 09:46:28 crc kubenswrapper[4722]: I1002 09:46:28.156727 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zj8n7" Oct 02 09:46:29 crc kubenswrapper[4722]: I1002 09:46:29.562909 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:29 crc kubenswrapper[4722]: I1002 09:46:29.562958 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:29 crc kubenswrapper[4722]: I1002 09:46:29.607994 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.026277 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-xzsh7"] Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.028228 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-xzsh7" Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.031114 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.031166 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.036336 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.036416 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-x6gm2" Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.036768 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-xzsh7"] Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.049980 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrzdd\" (UniqueName: \"kubernetes.io/projected/dd2b9c64-b51a-46b6-bbc7-c9289da3773a-kube-api-access-xrzdd\") pod \"mariadb-operator-index-xzsh7\" (UID: \"dd2b9c64-b51a-46b6-bbc7-c9289da3773a\") " pod="openstack-operators/mariadb-operator-index-xzsh7" Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.151735 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrzdd\" (UniqueName: \"kubernetes.io/projected/dd2b9c64-b51a-46b6-bbc7-c9289da3773a-kube-api-access-xrzdd\") pod \"mariadb-operator-index-xzsh7\" (UID: \"dd2b9c64-b51a-46b6-bbc7-c9289da3773a\") " pod="openstack-operators/mariadb-operator-index-xzsh7" Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.181457 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrzdd\" (UniqueName: \"kubernetes.io/projected/dd2b9c64-b51a-46b6-bbc7-c9289da3773a-kube-api-access-xrzdd\") pod \"mariadb-operator-index-xzsh7\" (UID: \"dd2b9c64-b51a-46b6-bbc7-c9289da3773a\") " pod="openstack-operators/mariadb-operator-index-xzsh7" Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.345617 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-xzsh7" Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.759940 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-xzsh7"] Oct 02 09:46:30 crc kubenswrapper[4722]: W1002 09:46:30.766543 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd2b9c64_b51a_46b6_bbc7_c9289da3773a.slice/crio-2349958cf755ae9429adc5951a4d5decf409861d5a4e9ea5fdc2b85eafd0b0ae WatchSource:0}: Error finding container 2349958cf755ae9429adc5951a4d5decf409861d5a4e9ea5fdc2b85eafd0b0ae: Status 404 returned error can't find the container with id 2349958cf755ae9429adc5951a4d5decf409861d5a4e9ea5fdc2b85eafd0b0ae Oct 02 09:46:30 crc kubenswrapper[4722]: I1002 09:46:30.989142 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-xzsh7" event={"ID":"dd2b9c64-b51a-46b6-bbc7-c9289da3773a","Type":"ContainerStarted","Data":"2349958cf755ae9429adc5951a4d5decf409861d5a4e9ea5fdc2b85eafd0b0ae"} Oct 02 09:46:32 crc kubenswrapper[4722]: I1002 09:46:32.354794 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:32 crc kubenswrapper[4722]: I1002 09:46:32.354913 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:32 crc kubenswrapper[4722]: I1002 09:46:32.412208 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:33 crc kubenswrapper[4722]: I1002 09:46:33.047520 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.023294 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8bchh"] Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.024164 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8bchh" podUID="6e5d7413-99c0-47f6-9d88-79798e9f6225" containerName="registry-server" containerID="cri-o://1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5" gracePeriod=2 Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.726165 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.866487 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gxwq\" (UniqueName: \"kubernetes.io/projected/6e5d7413-99c0-47f6-9d88-79798e9f6225-kube-api-access-2gxwq\") pod \"6e5d7413-99c0-47f6-9d88-79798e9f6225\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.866554 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-catalog-content\") pod \"6e5d7413-99c0-47f6-9d88-79798e9f6225\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.866646 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-utilities\") pod \"6e5d7413-99c0-47f6-9d88-79798e9f6225\" (UID: \"6e5d7413-99c0-47f6-9d88-79798e9f6225\") " Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.867350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-utilities" (OuterVolumeSpecName: "utilities") pod "6e5d7413-99c0-47f6-9d88-79798e9f6225" (UID: "6e5d7413-99c0-47f6-9d88-79798e9f6225"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.871692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e5d7413-99c0-47f6-9d88-79798e9f6225-kube-api-access-2gxwq" (OuterVolumeSpecName: "kube-api-access-2gxwq") pod "6e5d7413-99c0-47f6-9d88-79798e9f6225" (UID: "6e5d7413-99c0-47f6-9d88-79798e9f6225"). InnerVolumeSpecName "kube-api-access-2gxwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.914095 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e5d7413-99c0-47f6-9d88-79798e9f6225" (UID: "6e5d7413-99c0-47f6-9d88-79798e9f6225"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.969200 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gxwq\" (UniqueName: \"kubernetes.io/projected/6e5d7413-99c0-47f6-9d88-79798e9f6225-kube-api-access-2gxwq\") on node \"crc\" DevicePath \"\"" Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.969270 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:46:35 crc kubenswrapper[4722]: I1002 09:46:35.969286 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e5d7413-99c0-47f6-9d88-79798e9f6225-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.048010 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e5d7413-99c0-47f6-9d88-79798e9f6225" containerID="1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5" exitCode=0 Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.048076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bchh" event={"ID":"6e5d7413-99c0-47f6-9d88-79798e9f6225","Type":"ContainerDied","Data":"1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5"} Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.048118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bchh" event={"ID":"6e5d7413-99c0-47f6-9d88-79798e9f6225","Type":"ContainerDied","Data":"7e76fa916af61b22798f528c31be9cfd05e40aee34c442b9f9a0ec464e76cfb0"} Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.048140 4722 scope.go:117] "RemoveContainer" containerID="1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.048168 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8bchh" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.083948 4722 scope.go:117] "RemoveContainer" containerID="84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.112854 4722 scope.go:117] "RemoveContainer" containerID="938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.113669 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8bchh"] Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.117170 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8bchh"] Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.147839 4722 scope.go:117] "RemoveContainer" containerID="1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5" Oct 02 09:46:36 crc kubenswrapper[4722]: E1002 09:46:36.148404 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5\": container with ID starting with 1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5 not found: ID does not exist" containerID="1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.148444 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5"} err="failed to get container status \"1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5\": rpc error: code = NotFound desc = could not find container \"1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5\": container with ID starting with 1093dbaad630edf2c37938b935ce3d6a6aed3a5b259a44c1b51b572649e3d0f5 not found: ID does not exist" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.148474 4722 scope.go:117] "RemoveContainer" containerID="84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e" Oct 02 09:46:36 crc kubenswrapper[4722]: E1002 09:46:36.149107 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e\": container with ID starting with 84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e not found: ID does not exist" containerID="84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.149179 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e"} err="failed to get container status \"84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e\": rpc error: code = NotFound desc = could not find container \"84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e\": container with ID starting with 84d6963ddcb812883d408f4871b6fbb9af45da7524a08d119b2453bdbcbc8d5e not found: ID does not exist" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.149230 4722 scope.go:117] "RemoveContainer" containerID="938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15" Oct 02 09:46:36 crc kubenswrapper[4722]: E1002 09:46:36.150048 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15\": container with ID starting with 938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15 not found: ID does not exist" containerID="938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.150079 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15"} err="failed to get container status \"938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15\": rpc error: code = NotFound desc = could not find container \"938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15\": container with ID starting with 938c29f57ed246042ff3aeb829302ac6290d777b834e731f8e23fea449b97a15 not found: ID does not exist" Oct 02 09:46:36 crc kubenswrapper[4722]: I1002 09:46:36.726755 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e5d7413-99c0-47f6-9d88-79798e9f6225" path="/var/lib/kubelet/pods/6e5d7413-99c0-47f6-9d88-79798e9f6225/volumes" Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.057090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-xzsh7" event={"ID":"dd2b9c64-b51a-46b6-bbc7-c9289da3773a","Type":"ContainerStarted","Data":"b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171"} Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.086788 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-xzsh7" podStartSLOduration=1.84398194 podStartE2EDuration="7.086759971s" podCreationTimestamp="2025-10-02 09:46:30 +0000 UTC" firstStartedPulling="2025-10-02 09:46:30.768945121 +0000 UTC m=+846.805698101" lastFinishedPulling="2025-10-02 09:46:36.011723152 +0000 UTC m=+852.048476132" observedRunningTime="2025-10-02 09:46:37.0790742 +0000 UTC m=+853.115827200" watchObservedRunningTime="2025-10-02 09:46:37.086759971 +0000 UTC m=+853.123512961" Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.215264 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szbtd"] Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.215528 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-szbtd" podUID="f36fadf5-d108-4d4b-8985-d9450d8a4368" containerName="registry-server" containerID="cri-o://8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3" gracePeriod=2 Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.618366 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szbtd" Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.795768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-utilities\") pod \"f36fadf5-d108-4d4b-8985-d9450d8a4368\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.795867 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-catalog-content\") pod \"f36fadf5-d108-4d4b-8985-d9450d8a4368\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.795991 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkhn5\" (UniqueName: \"kubernetes.io/projected/f36fadf5-d108-4d4b-8985-d9450d8a4368-kube-api-access-wkhn5\") pod \"f36fadf5-d108-4d4b-8985-d9450d8a4368\" (UID: \"f36fadf5-d108-4d4b-8985-d9450d8a4368\") " Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.797156 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-utilities" (OuterVolumeSpecName: "utilities") pod "f36fadf5-d108-4d4b-8985-d9450d8a4368" (UID: "f36fadf5-d108-4d4b-8985-d9450d8a4368"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.804256 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36fadf5-d108-4d4b-8985-d9450d8a4368-kube-api-access-wkhn5" (OuterVolumeSpecName: "kube-api-access-wkhn5") pod "f36fadf5-d108-4d4b-8985-d9450d8a4368" (UID: "f36fadf5-d108-4d4b-8985-d9450d8a4368"). InnerVolumeSpecName "kube-api-access-wkhn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.812405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f36fadf5-d108-4d4b-8985-d9450d8a4368" (UID: "f36fadf5-d108-4d4b-8985-d9450d8a4368"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.897940 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.897998 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkhn5\" (UniqueName: \"kubernetes.io/projected/f36fadf5-d108-4d4b-8985-d9450d8a4368-kube-api-access-wkhn5\") on node \"crc\" DevicePath \"\"" Oct 02 09:46:37 crc kubenswrapper[4722]: I1002 09:46:37.898012 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36fadf5-d108-4d4b-8985-d9450d8a4368-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.067246 4722 generic.go:334] "Generic (PLEG): container finished" podID="f36fadf5-d108-4d4b-8985-d9450d8a4368" containerID="8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3" exitCode=0 Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.067560 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szbtd" event={"ID":"f36fadf5-d108-4d4b-8985-d9450d8a4368","Type":"ContainerDied","Data":"8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3"} Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.067610 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szbtd" event={"ID":"f36fadf5-d108-4d4b-8985-d9450d8a4368","Type":"ContainerDied","Data":"c1daa99b8019fcb7d0c7664c22b17f2e47c8cf1e47c025ab3592c5ebb40e6a2a"} Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.067657 4722 scope.go:117] "RemoveContainer" containerID="8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3" Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.067664 4722 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.067246 4722 generic.go:334] "Generic (PLEG): container finished" podID="f36fadf5-d108-4d4b-8985-d9450d8a4368" containerID="8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3" exitCode=0
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.067560 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szbtd" event={"ID":"f36fadf5-d108-4d4b-8985-d9450d8a4368","Type":"ContainerDied","Data":"8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3"}
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.067610 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szbtd" event={"ID":"f36fadf5-d108-4d4b-8985-d9450d8a4368","Type":"ContainerDied","Data":"c1daa99b8019fcb7d0c7664c22b17f2e47c8cf1e47c025ab3592c5ebb40e6a2a"}
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.067657 4722 scope.go:117] "RemoveContainer" containerID="8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3"
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.067664 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szbtd"
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.089424 4722 scope.go:117] "RemoveContainer" containerID="f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8"
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.105594 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szbtd"]
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.109957 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-szbtd"]
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.128148 4722 scope.go:117] "RemoveContainer" containerID="fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5"
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.146510 4722 scope.go:117] "RemoveContainer" containerID="8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3"
Oct 02 09:46:38 crc kubenswrapper[4722]: E1002 09:46:38.147022 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3\": container with ID starting with 8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3 not found: ID does not exist" containerID="8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3"
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.147064 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3"} err="failed to get container status \"8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3\": rpc error: code = NotFound desc = could not find container \"8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3\": container with ID starting with 8771df94fea9be76942b1f83250b6567a01a5fede7940f71ceb76216784c99f3 not found: ID does not exist"
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.147100 4722 scope.go:117] "RemoveContainer" containerID="f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8"
Oct 02 09:46:38 crc kubenswrapper[4722]: E1002 09:46:38.147397 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8\": container with ID starting with f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8 not found: ID does not exist" containerID="f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8"
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.147435 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8"} err="failed to get container status \"f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8\": rpc error: code = NotFound desc = could not find container \"f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8\": container with ID starting with f2f7a14180c2c1c56e202eb0a26909bfcbe75ff99afe43e2c313e3ab13eacfe8 not found: ID does not exist"
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.147452 4722 scope.go:117] "RemoveContainer" containerID="fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5"
Oct 02 09:46:38 crc kubenswrapper[4722]: E1002 09:46:38.148166 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5\": container with ID starting with fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5 not found: ID does not exist" containerID="fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5"
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.148196 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5"} err="failed to get container status \"fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5\": rpc error: code = NotFound desc = could not find container \"fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5\": container with ID starting with fd1853aaa1ba884089e453ec1af8a751f2ed846f5db67c6ddd005d22bc455bf5 not found: ID does not exist"
Oct 02 09:46:38 crc kubenswrapper[4722]: I1002 09:46:38.719929 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f36fadf5-d108-4d4b-8985-d9450d8a4368" path="/var/lib/kubelet/pods/f36fadf5-d108-4d4b-8985-d9450d8a4368/volumes"
Oct 02 09:46:40 crc kubenswrapper[4722]: I1002 09:46:40.346470 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-xzsh7"
Oct 02 09:46:40 crc kubenswrapper[4722]: I1002 09:46:40.346565 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-xzsh7"
Oct 02 09:46:40 crc kubenswrapper[4722]: I1002 09:46:40.394195 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-xzsh7"
Oct 02 09:46:41 crc kubenswrapper[4722]: I1002 09:46:41.118600 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-xzsh7"
containerName="extract-content" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.267643 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36fadf5-d108-4d4b-8985-d9450d8a4368" containerName="extract-content" Oct 02 09:46:43 crc kubenswrapper[4722]: E1002 09:46:43.267655 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e5d7413-99c0-47f6-9d88-79798e9f6225" containerName="extract-utilities" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.267661 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e5d7413-99c0-47f6-9d88-79798e9f6225" containerName="extract-utilities" Oct 02 09:46:43 crc kubenswrapper[4722]: E1002 09:46:43.267672 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e5d7413-99c0-47f6-9d88-79798e9f6225" containerName="extract-content" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.267677 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e5d7413-99c0-47f6-9d88-79798e9f6225" containerName="extract-content" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.267769 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e5d7413-99c0-47f6-9d88-79798e9f6225" containerName="registry-server" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.267831 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f36fadf5-d108-4d4b-8985-d9450d8a4368" containerName="registry-server" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.268569 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.271270 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bc9s8" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.275059 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"] Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.377333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-util\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.377408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf4k8\" (UniqueName: \"kubernetes.io/projected/550a0f2b-ce49-4428-b4c3-79183095ec89-kube-api-access-rf4k8\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.377455 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-bundle\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt" Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.478826 4722 
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.268569 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.271270 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bc9s8"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.275059 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"]
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.377333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-util\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.377408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf4k8\" (UniqueName: \"kubernetes.io/projected/550a0f2b-ce49-4428-b4c3-79183095ec89-kube-api-access-rf4k8\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.377455 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-bundle\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.478826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-bundle\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.478950 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-util\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.478992 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf4k8\" (UniqueName: \"kubernetes.io/projected/550a0f2b-ce49-4428-b4c3-79183095ec89-kube-api-access-rf4k8\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.479512 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-bundle\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.479853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-util\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.501932 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf4k8\" (UniqueName: \"kubernetes.io/projected/550a0f2b-ce49-4428-b4c3-79183095ec89-kube-api-access-rf4k8\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.583887 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:43 crc kubenswrapper[4722]: I1002 09:46:43.988000 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"]
Oct 02 09:46:44 crc kubenswrapper[4722]: I1002 09:46:44.106164 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt" event={"ID":"550a0f2b-ce49-4428-b4c3-79183095ec89","Type":"ContainerStarted","Data":"3bb60d13e32dca82d3baa2aef04fccd91bcb1f9e4288c36bb1cef80ae4652434"}
Oct 02 09:46:45 crc kubenswrapper[4722]: I1002 09:46:45.115873 4722 generic.go:334] "Generic (PLEG): container finished" podID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerID="1fd0c703c189cbd400d77c05a621dc7d6e197e2e28b1974a2199babc6acdac2c" exitCode=0
Oct 02 09:46:45 crc kubenswrapper[4722]: I1002 09:46:45.116074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt" event={"ID":"550a0f2b-ce49-4428-b4c3-79183095ec89","Type":"ContainerDied","Data":"1fd0c703c189cbd400d77c05a621dc7d6e197e2e28b1974a2199babc6acdac2c"}
Oct 02 09:46:46 crc kubenswrapper[4722]: I1002 09:46:46.125161 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt" event={"ID":"550a0f2b-ce49-4428-b4c3-79183095ec89","Type":"ContainerStarted","Data":"38cce376ec42b295682284805df97e7569723579ba5e9e2202eb6a4d6ef57aa0"}
Oct 02 09:46:47 crc kubenswrapper[4722]: I1002 09:46:47.133344 4722 generic.go:334] "Generic (PLEG): container finished" podID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerID="38cce376ec42b295682284805df97e7569723579ba5e9e2202eb6a4d6ef57aa0" exitCode=0
Oct 02 09:46:47 crc kubenswrapper[4722]: I1002 09:46:47.133866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt" event={"ID":"550a0f2b-ce49-4428-b4c3-79183095ec89","Type":"ContainerDied","Data":"38cce376ec42b295682284805df97e7569723579ba5e9e2202eb6a4d6ef57aa0"}
Oct 02 09:46:48 crc kubenswrapper[4722]: I1002 09:46:48.143609 4722 generic.go:334] "Generic (PLEG): container finished" podID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerID="b2bb7e0973d203eba9bf29734ad5a1b68a82b7b8ffeb405278438dc16e9f6cdb" exitCode=0
Oct 02 09:46:48 crc kubenswrapper[4722]: I1002 09:46:48.143869 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt" event={"ID":"550a0f2b-ce49-4428-b4c3-79183095ec89","Type":"ContainerDied","Data":"b2bb7e0973d203eba9bf29734ad5a1b68a82b7b8ffeb405278438dc16e9f6cdb"}
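
Note: the three short-lived containers above (named util, pull and extract in the RemoveStaleState entries at 09:46:56 below) are the operator bundle-unpack steps running to completion in sequence, each with exitCode=0. When sifting a log like this, the exit code in the generic.go:334 lines is the field that matters; a small Go helper to pull it out (the regexp is written for exactly this log shape):

    package main

    import (
        "fmt"
        "regexp"
    )

    var finished = regexp.MustCompile(`containerID="([0-9a-f]{64})" exitCode=(\d+)`)

    func main() {
        line := `I1002 09:46:45.115873 4722 generic.go:334] "Generic (PLEG): container finished" podID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerID="1fd0c703c189cbd400d77c05a621dc7d6e197e2e28b1974a2199babc6acdac2c" exitCode=0`
        if m := finished.FindStringSubmatch(line); m != nil {
            fmt.Printf("container %s... exited with code %s\n", m[1][:12], m[2])
        }
    }
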
Oct 02 09:46:49 crc kubenswrapper[4722]: I1002 09:46:49.436738 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:49 crc kubenswrapper[4722]: I1002 09:46:49.569223 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-util\") pod \"550a0f2b-ce49-4428-b4c3-79183095ec89\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") "
Oct 02 09:46:49 crc kubenswrapper[4722]: I1002 09:46:49.569495 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-bundle\") pod \"550a0f2b-ce49-4428-b4c3-79183095ec89\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") "
Oct 02 09:46:49 crc kubenswrapper[4722]: I1002 09:46:49.569626 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf4k8\" (UniqueName: \"kubernetes.io/projected/550a0f2b-ce49-4428-b4c3-79183095ec89-kube-api-access-rf4k8\") pod \"550a0f2b-ce49-4428-b4c3-79183095ec89\" (UID: \"550a0f2b-ce49-4428-b4c3-79183095ec89\") "
Oct 02 09:46:49 crc kubenswrapper[4722]: I1002 09:46:49.570566 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-bundle" (OuterVolumeSpecName: "bundle") pod "550a0f2b-ce49-4428-b4c3-79183095ec89" (UID: "550a0f2b-ce49-4428-b4c3-79183095ec89"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:46:49 crc kubenswrapper[4722]: I1002 09:46:49.576952 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550a0f2b-ce49-4428-b4c3-79183095ec89-kube-api-access-rf4k8" (OuterVolumeSpecName: "kube-api-access-rf4k8") pod "550a0f2b-ce49-4428-b4c3-79183095ec89" (UID: "550a0f2b-ce49-4428-b4c3-79183095ec89"). InnerVolumeSpecName "kube-api-access-rf4k8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:46:49 crc kubenswrapper[4722]: I1002 09:46:49.671568 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 09:46:49 crc kubenswrapper[4722]: I1002 09:46:49.671617 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf4k8\" (UniqueName: \"kubernetes.io/projected/550a0f2b-ce49-4428-b4c3-79183095ec89-kube-api-access-rf4k8\") on node \"crc\" DevicePath \"\""
Oct 02 09:46:50 crc kubenswrapper[4722]: I1002 09:46:50.017848 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-util" (OuterVolumeSpecName: "util") pod "550a0f2b-ce49-4428-b4c3-79183095ec89" (UID: "550a0f2b-ce49-4428-b4c3-79183095ec89"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:46:50 crc kubenswrapper[4722]: I1002 09:46:50.078849 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/550a0f2b-ce49-4428-b4c3-79183095ec89-util\") on node \"crc\" DevicePath \"\""
Oct 02 09:46:50 crc kubenswrapper[4722]: I1002 09:46:50.159249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt" event={"ID":"550a0f2b-ce49-4428-b4c3-79183095ec89","Type":"ContainerDied","Data":"3bb60d13e32dca82d3baa2aef04fccd91bcb1f9e4288c36bb1cef80ae4652434"}
Oct 02 09:46:50 crc kubenswrapper[4722]: I1002 09:46:50.159301 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bb60d13e32dca82d3baa2aef04fccd91bcb1f9e4288c36bb1cef80ae4652434"
Oct 02 09:46:50 crc kubenswrapper[4722]: I1002 09:46:50.159424 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.602216 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"]
Oct 02 09:46:56 crc kubenswrapper[4722]: E1002 09:46:56.603477 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerName="util"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.603501 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerName="util"
Oct 02 09:46:56 crc kubenswrapper[4722]: E1002 09:46:56.603523 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerName="pull"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.603533 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerName="pull"
Oct 02 09:46:56 crc kubenswrapper[4722]: E1002 09:46:56.603565 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerName="extract"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.603576 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerName="extract"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.603717 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="550a0f2b-ce49-4428-b4c3-79183095ec89" containerName="extract"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.604659 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.607112 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.607400 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rh9sl"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.608578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.615891 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"]
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.792115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-apiservice-cert\") pod \"mariadb-operator-controller-manager-585c4c46db-hwxpp\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.792191 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-webhook-cert\") pod \"mariadb-operator-controller-manager-585c4c46db-hwxpp\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.792222 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwkj4\" (UniqueName: \"kubernetes.io/projected/afacf224-9f10-4ab3-8044-f069cd3e8445-kube-api-access-nwkj4\") pod \"mariadb-operator-controller-manager-585c4c46db-hwxpp\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.893265 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwkj4\" (UniqueName: \"kubernetes.io/projected/afacf224-9f10-4ab3-8044-f069cd3e8445-kube-api-access-nwkj4\") pod \"mariadb-operator-controller-manager-585c4c46db-hwxpp\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.893408 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-apiservice-cert\") pod \"mariadb-operator-controller-manager-585c4c46db-hwxpp\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.893436 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-webhook-cert\") pod \"mariadb-operator-controller-manager-585c4c46db-hwxpp\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.902105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-webhook-cert\") pod \"mariadb-operator-controller-manager-585c4c46db-hwxpp\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.903983 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-apiservice-cert\") pod \"mariadb-operator-controller-manager-585c4c46db-hwxpp\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.923687 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwkj4\" (UniqueName: \"kubernetes.io/projected/afacf224-9f10-4ab3-8044-f069cd3e8445-kube-api-access-nwkj4\") pod \"mariadb-operator-controller-manager-585c4c46db-hwxpp\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:56 crc kubenswrapper[4722]: I1002 09:46:56.933663 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:46:57 crc kubenswrapper[4722]: I1002 09:46:57.272736 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"]
Oct 02 09:46:58 crc kubenswrapper[4722]: I1002 09:46:58.236485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" event={"ID":"afacf224-9f10-4ab3-8044-f069cd3e8445","Type":"ContainerStarted","Data":"aa34f5af3cab89b490a8f2bb8a11a0ea565cdca73c1534988edf87fd77b94937"}
Oct 02 09:47:02 crc kubenswrapper[4722]: I1002 09:47:02.258360 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" event={"ID":"afacf224-9f10-4ab3-8044-f069cd3e8445","Type":"ContainerStarted","Data":"b598e9b8853344c23fb4cf765a38797ccbdeb8edfce92266a9bad5c2bd5b1ecd"}
Oct 02 09:47:14 crc kubenswrapper[4722]: I1002 09:47:14.327402 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" event={"ID":"afacf224-9f10-4ab3-8044-f069cd3e8445","Type":"ContainerStarted","Data":"366016eee61cb7e974c42abc87061a25332351865ca9fd0f129cfde1f04b0520"}
Oct 02 09:47:14 crc kubenswrapper[4722]: I1002 09:47:14.328198 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:47:14 crc kubenswrapper[4722]: I1002 09:47:14.333189 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"
Oct 02 09:47:14 crc kubenswrapper[4722]: I1002 09:47:14.348187 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" podStartSLOduration=2.465449675 podStartE2EDuration="18.348168347s" podCreationTimestamp="2025-10-02 09:46:56 +0000 UTC" firstStartedPulling="2025-10-02 09:46:57.279970647 +0000 UTC m=+873.316723617" lastFinishedPulling="2025-10-02 09:47:13.162689309 +0000 UTC m=+889.199442289" observedRunningTime="2025-10-02 09:47:14.347742646 +0000 UTC m=+890.384495656" watchObservedRunningTime="2025-10-02 09:47:14.348168347 +0000 UTC m=+890.384921327"
Oct 02 09:47:15 crc kubenswrapper[4722]: I1002 09:47:15.516475 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-6xnxr"]
Oct 02 09:47:15 crc kubenswrapper[4722]: I1002 09:47:15.517277 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-6xnxr"
Oct 02 09:47:15 crc kubenswrapper[4722]: I1002 09:47:15.523167 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-9xbqk"
Oct 02 09:47:15 crc kubenswrapper[4722]: I1002 09:47:15.598530 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-6xnxr"]
Oct 02 09:47:15 crc kubenswrapper[4722]: I1002 09:47:15.663465 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg4ld\" (UniqueName: \"kubernetes.io/projected/5e693b9f-5222-4ab6-a8f0-060116630808-kube-api-access-xg4ld\") pod \"infra-operator-index-6xnxr\" (UID: \"5e693b9f-5222-4ab6-a8f0-060116630808\") " pod="openstack-operators/infra-operator-index-6xnxr"
Oct 02 09:47:15 crc kubenswrapper[4722]: I1002 09:47:15.764856 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg4ld\" (UniqueName: \"kubernetes.io/projected/5e693b9f-5222-4ab6-a8f0-060116630808-kube-api-access-xg4ld\") pod \"infra-operator-index-6xnxr\" (UID: \"5e693b9f-5222-4ab6-a8f0-060116630808\") " pod="openstack-operators/infra-operator-index-6xnxr"
Oct 02 09:47:15 crc kubenswrapper[4722]: I1002 09:47:15.791738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg4ld\" (UniqueName: \"kubernetes.io/projected/5e693b9f-5222-4ab6-a8f0-060116630808-kube-api-access-xg4ld\") pod \"infra-operator-index-6xnxr\" (UID: \"5e693b9f-5222-4ab6-a8f0-060116630808\") " pod="openstack-operators/infra-operator-index-6xnxr"
Need to start a new one" pod="openstack-operators/infra-operator-index-6xnxr" Oct 02 09:47:16 crc kubenswrapper[4722]: I1002 09:47:16.107203 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-6xnxr"] Oct 02 09:47:16 crc kubenswrapper[4722]: W1002 09:47:16.127719 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e693b9f_5222_4ab6_a8f0_060116630808.slice/crio-068dc66a9ebbb94483c36f85609c947ffacf881c81ccc78da60ac9d6c0f0224f WatchSource:0}: Error finding container 068dc66a9ebbb94483c36f85609c947ffacf881c81ccc78da60ac9d6c0f0224f: Status 404 returned error can't find the container with id 068dc66a9ebbb94483c36f85609c947ffacf881c81ccc78da60ac9d6c0f0224f Oct 02 09:47:16 crc kubenswrapper[4722]: I1002 09:47:16.360824 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-6xnxr" event={"ID":"5e693b9f-5222-4ab6-a8f0-060116630808","Type":"ContainerStarted","Data":"068dc66a9ebbb94483c36f85609c947ffacf881c81ccc78da60ac9d6c0f0224f"} Oct 02 09:47:17 crc kubenswrapper[4722]: I1002 09:47:17.369150 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-6xnxr" event={"ID":"5e693b9f-5222-4ab6-a8f0-060116630808","Type":"ContainerStarted","Data":"941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618"} Oct 02 09:47:17 crc kubenswrapper[4722]: I1002 09:47:17.390084 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-6xnxr" podStartSLOduration=1.541651126 podStartE2EDuration="2.39006002s" podCreationTimestamp="2025-10-02 09:47:15 +0000 UTC" firstStartedPulling="2025-10-02 09:47:16.129984999 +0000 UTC m=+892.166737979" lastFinishedPulling="2025-10-02 09:47:16.978393893 +0000 UTC m=+893.015146873" observedRunningTime="2025-10-02 09:47:17.389589098 +0000 UTC m=+893.426342098" watchObservedRunningTime="2025-10-02 09:47:17.39006002 +0000 UTC m=+893.426813000" Oct 02 09:47:18 crc kubenswrapper[4722]: I1002 09:47:18.266056 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-6xnxr"] Oct 02 09:47:18 crc kubenswrapper[4722]: I1002 09:47:18.870923 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-gpr2x"] Oct 02 09:47:18 crc kubenswrapper[4722]: I1002 09:47:18.871719 4722 util.go:30] "No sandbox for pod can be found. 
Oct 02 09:47:16 crc kubenswrapper[4722]: I1002 09:47:16.360824 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-6xnxr" event={"ID":"5e693b9f-5222-4ab6-a8f0-060116630808","Type":"ContainerStarted","Data":"068dc66a9ebbb94483c36f85609c947ffacf881c81ccc78da60ac9d6c0f0224f"}
Oct 02 09:47:17 crc kubenswrapper[4722]: I1002 09:47:17.369150 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-6xnxr" event={"ID":"5e693b9f-5222-4ab6-a8f0-060116630808","Type":"ContainerStarted","Data":"941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618"}
Oct 02 09:47:17 crc kubenswrapper[4722]: I1002 09:47:17.390084 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-6xnxr" podStartSLOduration=1.541651126 podStartE2EDuration="2.39006002s" podCreationTimestamp="2025-10-02 09:47:15 +0000 UTC" firstStartedPulling="2025-10-02 09:47:16.129984999 +0000 UTC m=+892.166737979" lastFinishedPulling="2025-10-02 09:47:16.978393893 +0000 UTC m=+893.015146873" observedRunningTime="2025-10-02 09:47:17.389589098 +0000 UTC m=+893.426342098" watchObservedRunningTime="2025-10-02 09:47:17.39006002 +0000 UTC m=+893.426813000"
Oct 02 09:47:18 crc kubenswrapper[4722]: I1002 09:47:18.266056 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-6xnxr"]
Oct 02 09:47:18 crc kubenswrapper[4722]: I1002 09:47:18.870923 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-gpr2x"]
Oct 02 09:47:18 crc kubenswrapper[4722]: I1002 09:47:18.871719 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:47:18 crc kubenswrapper[4722]: I1002 09:47:18.884632 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-gpr2x"]
Oct 02 09:47:18 crc kubenswrapper[4722]: I1002 09:47:18.915377 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-855cv\" (UniqueName: \"kubernetes.io/projected/f25cb989-7ca5-4606-bad2-9a63992d027c-kube-api-access-855cv\") pod \"infra-operator-index-gpr2x\" (UID: \"f25cb989-7ca5-4606-bad2-9a63992d027c\") " pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:47:19 crc kubenswrapper[4722]: I1002 09:47:19.017252 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855cv\" (UniqueName: \"kubernetes.io/projected/f25cb989-7ca5-4606-bad2-9a63992d027c-kube-api-access-855cv\") pod \"infra-operator-index-gpr2x\" (UID: \"f25cb989-7ca5-4606-bad2-9a63992d027c\") " pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:47:19 crc kubenswrapper[4722]: I1002 09:47:19.047110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-855cv\" (UniqueName: \"kubernetes.io/projected/f25cb989-7ca5-4606-bad2-9a63992d027c-kube-api-access-855cv\") pod \"infra-operator-index-gpr2x\" (UID: \"f25cb989-7ca5-4606-bad2-9a63992d027c\") " pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:47:19 crc kubenswrapper[4722]: I1002 09:47:19.189736 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:47:19 crc kubenswrapper[4722]: I1002 09:47:19.383105 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-6xnxr" podUID="5e693b9f-5222-4ab6-a8f0-060116630808" containerName="registry-server" containerID="cri-o://941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618" gracePeriod=2
Oct 02 09:47:19 crc kubenswrapper[4722]: I1002 09:47:19.590446 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-gpr2x"]
Oct 02 09:47:19 crc kubenswrapper[4722]: W1002 09:47:19.599766 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf25cb989_7ca5_4606_bad2_9a63992d027c.slice/crio-d9fe3b3c90312acca2c0be6f0e2f3d14617d7da392e5d55bbad2e1a15fb0ab07 WatchSource:0}: Error finding container d9fe3b3c90312acca2c0be6f0e2f3d14617d7da392e5d55bbad2e1a15fb0ab07: Status 404 returned error can't find the container with id d9fe3b3c90312acca2c0be6f0e2f3d14617d7da392e5d55bbad2e1a15fb0ab07
Oct 02 09:47:19 crc kubenswrapper[4722]: I1002 09:47:19.700388 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-6xnxr"
Oct 02 09:47:19 crc kubenswrapper[4722]: I1002 09:47:19.828833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg4ld\" (UniqueName: \"kubernetes.io/projected/5e693b9f-5222-4ab6-a8f0-060116630808-kube-api-access-xg4ld\") pod \"5e693b9f-5222-4ab6-a8f0-060116630808\" (UID: \"5e693b9f-5222-4ab6-a8f0-060116630808\") "
Oct 02 09:47:19 crc kubenswrapper[4722]: I1002 09:47:19.832635 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e693b9f-5222-4ab6-a8f0-060116630808-kube-api-access-xg4ld" (OuterVolumeSpecName: "kube-api-access-xg4ld") pod "5e693b9f-5222-4ab6-a8f0-060116630808" (UID: "5e693b9f-5222-4ab6-a8f0-060116630808"). InnerVolumeSpecName "kube-api-access-xg4ld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:47:19 crc kubenswrapper[4722]: I1002 09:47:19.930517 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg4ld\" (UniqueName: \"kubernetes.io/projected/5e693b9f-5222-4ab6-a8f0-060116630808-kube-api-access-xg4ld\") on node \"crc\" DevicePath \"\""
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.390210 4722 generic.go:334] "Generic (PLEG): container finished" podID="5e693b9f-5222-4ab6-a8f0-060116630808" containerID="941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618" exitCode=0
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.390285 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-6xnxr"
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.390401 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-6xnxr" event={"ID":"5e693b9f-5222-4ab6-a8f0-060116630808","Type":"ContainerDied","Data":"941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618"}
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.390832 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-6xnxr" event={"ID":"5e693b9f-5222-4ab6-a8f0-060116630808","Type":"ContainerDied","Data":"068dc66a9ebbb94483c36f85609c947ffacf881c81ccc78da60ac9d6c0f0224f"}
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.390866 4722 scope.go:117] "RemoveContainer" containerID="941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618"
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.392921 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-gpr2x" event={"ID":"f25cb989-7ca5-4606-bad2-9a63992d027c","Type":"ContainerStarted","Data":"b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b"}
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.392958 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-gpr2x" event={"ID":"f25cb989-7ca5-4606-bad2-9a63992d027c","Type":"ContainerStarted","Data":"d9fe3b3c90312acca2c0be6f0e2f3d14617d7da392e5d55bbad2e1a15fb0ab07"}
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.405181 4722 scope.go:117] "RemoveContainer" containerID="941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618"
Oct 02 09:47:20 crc kubenswrapper[4722]: E1002 09:47:20.405634 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618\": container with ID starting with 941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618 not found: ID does not exist" containerID="941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618"
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.405683 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618"} err="failed to get container status \"941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618\": rpc error: code = NotFound desc = could not find container \"941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618\": container with ID starting with 941d21e2bf3c67605be3157d5d07f5d2fc32277ebef8cbcc9585561083dbd618 not found: ID does not exist"
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.431095 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-gpr2x" podStartSLOduration=1.897071634 podStartE2EDuration="2.43106531s" podCreationTimestamp="2025-10-02 09:47:18 +0000 UTC" firstStartedPulling="2025-10-02 09:47:19.605242001 +0000 UTC m=+895.641994981" lastFinishedPulling="2025-10-02 09:47:20.139235677 +0000 UTC m=+896.175988657" observedRunningTime="2025-10-02 09:47:20.415161617 +0000 UTC m=+896.451914607" watchObservedRunningTime="2025-10-02 09:47:20.43106531 +0000 UTC m=+896.467818300"
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.442384 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-6xnxr"]
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.448040 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-6xnxr"]
Oct 02 09:47:20 crc kubenswrapper[4722]: I1002 09:47:20.729343 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e693b9f-5222-4ab6-a8f0-060116630808" path="/var/lib/kubelet/pods/5e693b9f-5222-4ab6-a8f0-060116630808/volumes"
Oct 02 09:47:29 crc kubenswrapper[4722]: I1002 09:47:29.190696 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:47:29 crc kubenswrapper[4722]: I1002 09:47:29.191231 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:47:29 crc kubenswrapper[4722]: I1002 09:47:29.216432 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:47:29 crc kubenswrapper[4722]: I1002 09:47:29.476883 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.708087 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"]
Oct 02 09:47:30 crc kubenswrapper[4722]: E1002 09:47:30.708368 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e693b9f-5222-4ab6-a8f0-060116630808" containerName="registry-server"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.708380 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e693b9f-5222-4ab6-a8f0-060116630808" containerName="registry-server"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.708490 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e693b9f-5222-4ab6-a8f0-060116630808" containerName="registry-server"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.709556 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.714465 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bc9s8"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.720374 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"]
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.886102 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-bundle\") pod \"dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") " pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.886270 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-util\") pod \"dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") " pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.886428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q7f6\" (UniqueName: \"kubernetes.io/projected/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-kube-api-access-2q7f6\") pod \"dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") " pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.988005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-util\") pod \"dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") " pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.988078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q7f6\" (UniqueName: \"kubernetes.io/projected/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-kube-api-access-2q7f6\") pod \"dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") " pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.988121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-bundle\") pod \"dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") " pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.988570 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-util\") pod \"dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") " pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:30 crc kubenswrapper[4722]: I1002 09:47:30.988619 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-bundle\") pod \"dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") " pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:31 crc kubenswrapper[4722]: I1002 09:47:31.007666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q7f6\" (UniqueName: \"kubernetes.io/projected/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-kube-api-access-2q7f6\") pod \"dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") " pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:31 crc kubenswrapper[4722]: I1002 09:47:31.037481 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:31 crc kubenswrapper[4722]: I1002 09:47:31.223753 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"]
Oct 02 09:47:31 crc kubenswrapper[4722]: I1002 09:47:31.474219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk" event={"ID":"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e","Type":"ContainerDied","Data":"b14120a39461b996163f77b4b4ed0c74bf772aa3d673d899cbb7032d984b076c"}
Oct 02 09:47:31 crc kubenswrapper[4722]: I1002 09:47:31.479448 4722 generic.go:334] "Generic (PLEG): container finished" podID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" containerID="b14120a39461b996163f77b4b4ed0c74bf772aa3d673d899cbb7032d984b076c" exitCode=0
Oct 02 09:47:31 crc kubenswrapper[4722]: I1002 09:47:31.479589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk" event={"ID":"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e","Type":"ContainerStarted","Data":"b81e526d894a4b6395a7f64284d20ba5585a010eaa2b86ef9d8edd42f2f3b305"}
Oct 02 09:47:32 crc kubenswrapper[4722]: I1002 09:47:32.489145 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk" event={"ID":"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e","Type":"ContainerStarted","Data":"3563eca56b33edd989039c37a68c6da54c9c98a40cd0994f4339847b4426b7be"}
Oct 02 09:47:33 crc kubenswrapper[4722]: I1002 09:47:33.498402 4722 generic.go:334] "Generic (PLEG): container finished" podID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" containerID="3563eca56b33edd989039c37a68c6da54c9c98a40cd0994f4339847b4426b7be" exitCode=0
Oct 02 09:47:33 crc kubenswrapper[4722]: I1002 09:47:33.498499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk" event={"ID":"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e","Type":"ContainerDied","Data":"3563eca56b33edd989039c37a68c6da54c9c98a40cd0994f4339847b4426b7be"}
Oct 02 09:47:34 crc kubenswrapper[4722]: I1002 09:47:34.508693 4722 generic.go:334] "Generic (PLEG): container finished" podID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" containerID="d3b357163a5285eb305a6bc4e9bdd9c06651e5234fe897816b25d2186225f903" exitCode=0
Oct 02 09:47:34 crc kubenswrapper[4722]: I1002 09:47:34.508744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk" event={"ID":"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e","Type":"ContainerDied","Data":"d3b357163a5285eb305a6bc4e9bdd9c06651e5234fe897816b25d2186225f903"}
Oct 02 09:47:35 crc kubenswrapper[4722]: I1002 09:47:35.799066 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:35 crc kubenswrapper[4722]: I1002 09:47:35.962066 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q7f6\" (UniqueName: \"kubernetes.io/projected/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-kube-api-access-2q7f6\") pod \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") "
Oct 02 09:47:35 crc kubenswrapper[4722]: I1002 09:47:35.962149 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-util\") pod \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") "
Oct 02 09:47:35 crc kubenswrapper[4722]: I1002 09:47:35.962185 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-bundle\") pod \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\" (UID: \"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e\") "
Oct 02 09:47:35 crc kubenswrapper[4722]: I1002 09:47:35.963447 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-bundle" (OuterVolumeSpecName: "bundle") pod "da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" (UID: "da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:47:35 crc kubenswrapper[4722]: I1002 09:47:35.968190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-kube-api-access-2q7f6" (OuterVolumeSpecName: "kube-api-access-2q7f6") pod "da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" (UID: "da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e"). InnerVolumeSpecName "kube-api-access-2q7f6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:47:35 crc kubenswrapper[4722]: I1002 09:47:35.978819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-util" (OuterVolumeSpecName: "util") pod "da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" (UID: "da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:47:36 crc kubenswrapper[4722]: I1002 09:47:36.063189 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 09:47:36 crc kubenswrapper[4722]: I1002 09:47:36.063267 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q7f6\" (UniqueName: \"kubernetes.io/projected/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-kube-api-access-2q7f6\") on node \"crc\" DevicePath \"\""
Oct 02 09:47:36 crc kubenswrapper[4722]: I1002 09:47:36.063277 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e-util\") on node \"crc\" DevicePath \"\""
Oct 02 09:47:36 crc kubenswrapper[4722]: I1002 09:47:36.524242 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk" event={"ID":"da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e","Type":"ContainerDied","Data":"b81e526d894a4b6395a7f64284d20ba5585a010eaa2b86ef9d8edd42f2f3b305"}
Oct 02 09:47:36 crc kubenswrapper[4722]: I1002 09:47:36.524284 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"
Oct 02 09:47:36 crc kubenswrapper[4722]: I1002 09:47:36.524292 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b81e526d894a4b6395a7f64284d20ba5585a010eaa2b86ef9d8edd42f2f3b305"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.360741 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"]
Oct 02 09:47:58 crc kubenswrapper[4722]: E1002 09:47:58.361923 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" containerName="util"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.361961 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" containerName="util"
Oct 02 09:47:58 crc kubenswrapper[4722]: E1002 09:47:58.361985 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" containerName="pull"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.361991 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" containerName="pull"
Oct 02 09:47:58 crc kubenswrapper[4722]: E1002 09:47:58.362002 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" containerName="extract"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.362011 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" containerName="extract"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.362171 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" containerName="extract"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.363037 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.365701 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.365896 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9lcfh"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.374837 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"]
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.469006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-apiservice-cert\") pod \"infra-operator-controller-manager-59886d58dd-rdcd7\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") " pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.469096 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqfxh\" (UniqueName: \"kubernetes.io/projected/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-kube-api-access-bqfxh\") pod \"infra-operator-controller-manager-59886d58dd-rdcd7\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") " pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.469134 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-webhook-cert\") pod \"infra-operator-controller-manager-59886d58dd-rdcd7\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") " pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.571014 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqfxh\" (UniqueName: \"kubernetes.io/projected/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-kube-api-access-bqfxh\") pod \"infra-operator-controller-manager-59886d58dd-rdcd7\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") " pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.571091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-webhook-cert\") pod \"infra-operator-controller-manager-59886d58dd-rdcd7\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") " pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.571164 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-apiservice-cert\") pod \"infra-operator-controller-manager-59886d58dd-rdcd7\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") " pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"
Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.578564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-webhook-cert\") pod \"infra-operator-controller-manager-59886d58dd-rdcd7\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") " pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.578564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-apiservice-cert\") pod \"infra-operator-controller-manager-59886d58dd-rdcd7\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") " pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.595679 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqfxh\" (UniqueName: \"kubernetes.io/projected/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-kube-api-access-bqfxh\") pod \"infra-operator-controller-manager-59886d58dd-rdcd7\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") " pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" Oct 02 09:47:58 crc kubenswrapper[4722]: I1002 09:47:58.682647 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" Oct 02 09:47:59 crc kubenswrapper[4722]: I1002 09:47:59.061377 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"] Oct 02 09:47:59 crc kubenswrapper[4722]: I1002 09:47:59.661771 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" event={"ID":"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e","Type":"ContainerStarted","Data":"b13062f25b60e69fadf625de8cf87f55125f6de82f12f53793c77009ce014625"} Oct 02 09:48:01 crc kubenswrapper[4722]: I1002 09:48:01.678944 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" event={"ID":"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e","Type":"ContainerStarted","Data":"783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28"} Oct 02 09:48:01 crc kubenswrapper[4722]: I1002 09:48:01.679352 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" Oct 02 09:48:01 crc kubenswrapper[4722]: I1002 09:48:01.679374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" event={"ID":"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e","Type":"ContainerStarted","Data":"f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71"} Oct 02 09:48:01 crc kubenswrapper[4722]: I1002 09:48:01.705718 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" podStartSLOduration=1.615583964 podStartE2EDuration="3.705686792s" podCreationTimestamp="2025-10-02 09:47:58 +0000 UTC" firstStartedPulling="2025-10-02 09:47:59.079502868 +0000 UTC m=+935.116255848" lastFinishedPulling="2025-10-02 09:48:01.169605696 +0000 UTC m=+937.206358676" observedRunningTime="2025-10-02 09:48:01.702605005 +0000 UTC m=+937.739358005" watchObservedRunningTime="2025-10-02 09:48:01.705686792 +0000 UTC m=+937.742439772" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 
09:48:03.498128 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.498991 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.503523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"galera-openstack-dockercfg-slr4k" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.503763 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"kube-root-ca.crt" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.503897 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openstack-scripts" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.504093 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"osp-secret" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.504226 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openshift-service-ca.crt" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.505557 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openstack-config-data" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.507663 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.508689 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.514611 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.515610 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.518967 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.538088 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.546006 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617205 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-secrets\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-kolla-config\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-default\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617520 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-default\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617540 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kolla-config\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617612 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-generated\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617638 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-operator-scripts\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617685 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw79b\" (UniqueName: \"kubernetes.io/projected/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kube-api-access-dw79b\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617699 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-operator-scripts\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kolla-config\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617745 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-generated\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617762 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7cb4f813-1f97-4272-943d-05fa71c599dc-secrets\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-default\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617813 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsv5d\" (UniqueName: \"kubernetes.io/projected/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kube-api-access-gsv5d\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617840 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6gm2\" (UniqueName: \"kubernetes.io/projected/7cb4f813-1f97-4272-943d-05fa71c599dc-kube-api-access-z6gm2\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.617875 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4383f5f8-eb32-4c6b-a3a5-0943a567e856-secrets\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.719018 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.719121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-default\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.719149 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsv5d\" (UniqueName: \"kubernetes.io/projected/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kube-api-access-gsv5d\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.719199 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6gm2\" (UniqueName: \"kubernetes.io/projected/7cb4f813-1f97-4272-943d-05fa71c599dc-kube-api-access-z6gm2\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.719231 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.719260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4383f5f8-eb32-4c6b-a3a5-0943a567e856-secrets\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.719292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-secrets\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.719586 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") device mount path \"/mnt/openstack/pv06\"" pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.719716 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") device mount path \"/mnt/openstack/pv02\"" pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.720203 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-default\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.720963 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.725532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726020 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-kolla-config\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-default\") pod \"openstack-galera-2\" (UID: 
\"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-default\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kolla-config\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726144 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-generated\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-operator-scripts\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw79b\" (UniqueName: \"kubernetes.io/projected/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kube-api-access-dw79b\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-operator-scripts\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726304 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kolla-config\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 
crc kubenswrapper[4722]: I1002 09:48:03.726341 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-generated\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726367 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7cb4f813-1f97-4272-943d-05fa71c599dc-secrets\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726574 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") device mount path \"/mnt/openstack/pv01\"" pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.726798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-kolla-config\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.727064 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-default\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.727231 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kolla-config\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.728090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-default\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.728407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.728611 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-generated\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.728934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kolla-config\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.729054 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4383f5f8-eb32-4c6b-a3a5-0943a567e856-secrets\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.729147 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-secrets\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.729225 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-generated\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.736114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-operator-scripts\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.742519 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-operator-scripts\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.743718 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7cb4f813-1f97-4272-943d-05fa71c599dc-secrets\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.744473 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6gm2\" (UniqueName: \"kubernetes.io/projected/7cb4f813-1f97-4272-943d-05fa71c599dc-kube-api-access-z6gm2\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.745677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw79b\" (UniqueName: \"kubernetes.io/projected/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kube-api-access-dw79b\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.745685 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " pod="barbican-kuttl-tests/openstack-galera-2" 
Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.745688 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.746325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsv5d\" (UniqueName: \"kubernetes.io/projected/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kube-api-access-gsv5d\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.746962 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-1\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.817056 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.832924 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:03 crc kubenswrapper[4722]: I1002 09:48:03.846282 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:04 crc kubenswrapper[4722]: I1002 09:48:04.176113 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Oct 02 09:48:04 crc kubenswrapper[4722]: W1002 09:48:04.185195 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb4f813_1f97_4272_943d_05fa71c599dc.slice/crio-b423cb8a141a570a472bfc35562351bd1d37f95bb306a5ddce57e793b4f52835 WatchSource:0}: Error finding container b423cb8a141a570a472bfc35562351bd1d37f95bb306a5ddce57e793b4f52835: Status 404 returned error can't find the container with id b423cb8a141a570a472bfc35562351bd1d37f95bb306a5ddce57e793b4f52835 Oct 02 09:48:04 crc kubenswrapper[4722]: I1002 09:48:04.298150 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Oct 02 09:48:04 crc kubenswrapper[4722]: I1002 09:48:04.304721 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Oct 02 09:48:04 crc kubenswrapper[4722]: W1002 09:48:04.308960 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9103aa77_3c03_4ece_8293_b6ccd66a0cbb.slice/crio-e28c0db862557fe74826f533b34b811ddf23021242af250ea413cb4a9fcaeed7 WatchSource:0}: Error finding container e28c0db862557fe74826f533b34b811ddf23021242af250ea413cb4a9fcaeed7: Status 404 returned error can't find the container with id e28c0db862557fe74826f533b34b811ddf23021242af250ea413cb4a9fcaeed7 Oct 02 09:48:04 crc kubenswrapper[4722]: I1002 09:48:04.702106 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7cb4f813-1f97-4272-943d-05fa71c599dc","Type":"ContainerStarted","Data":"b423cb8a141a570a472bfc35562351bd1d37f95bb306a5ddce57e793b4f52835"} Oct 02 09:48:04 crc 
kubenswrapper[4722]: I1002 09:48:04.703263 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"9103aa77-3c03-4ece-8293-b6ccd66a0cbb","Type":"ContainerStarted","Data":"e28c0db862557fe74826f533b34b811ddf23021242af250ea413cb4a9fcaeed7"} Oct 02 09:48:04 crc kubenswrapper[4722]: I1002 09:48:04.704504 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"4383f5f8-eb32-4c6b-a3a5-0943a567e856","Type":"ContainerStarted","Data":"286716b78b4286c5d54750dc243736f103ba7d7e2f8db7d3696f69153d8a37c1"} Oct 02 09:48:08 crc kubenswrapper[4722]: I1002 09:48:08.687437 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.325629 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/memcached-0"] Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.328898 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.332308 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"memcached-config-data" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.333149 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"memcached-memcached-dockercfg-j2lnt" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.348716 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.473549 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2z69\" (UniqueName: \"kubernetes.io/projected/b4d42537-11aa-4a12-8137-feca0a8d889b-kube-api-access-t2z69\") pod \"memcached-0\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.473646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-config-data\") pod \"memcached-0\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.473676 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-kolla-config\") pod \"memcached-0\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.583219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-config-data\") pod \"memcached-0\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.583399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-kolla-config\") pod \"memcached-0\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " 
pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.583481 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2z69\" (UniqueName: \"kubernetes.io/projected/b4d42537-11aa-4a12-8137-feca0a8d889b-kube-api-access-t2z69\") pod \"memcached-0\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.585037 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-config-data\") pod \"memcached-0\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.585928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-kolla-config\") pod \"memcached-0\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.612094 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2z69\" (UniqueName: \"kubernetes.io/projected/b4d42537-11aa-4a12-8137-feca0a8d889b-kube-api-access-t2z69\") pod \"memcached-0\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:11 crc kubenswrapper[4722]: I1002 09:48:11.660188 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:13 crc kubenswrapper[4722]: I1002 09:48:13.564135 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Oct 02 09:48:13 crc kubenswrapper[4722]: W1002 09:48:13.585739 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4d42537_11aa_4a12_8137_feca0a8d889b.slice/crio-276e0918463e11b8164acda9abbb1a7d0163a0fa86b696f8da313894e24e5619 WatchSource:0}: Error finding container 276e0918463e11b8164acda9abbb1a7d0163a0fa86b696f8da313894e24e5619: Status 404 returned error can't find the container with id 276e0918463e11b8164acda9abbb1a7d0163a0fa86b696f8da313894e24e5619 Oct 02 09:48:13 crc kubenswrapper[4722]: I1002 09:48:13.781721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"b4d42537-11aa-4a12-8137-feca0a8d889b","Type":"ContainerStarted","Data":"276e0918463e11b8164acda9abbb1a7d0163a0fa86b696f8da313894e24e5619"} Oct 02 09:48:13 crc kubenswrapper[4722]: I1002 09:48:13.783067 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7cb4f813-1f97-4272-943d-05fa71c599dc","Type":"ContainerStarted","Data":"f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb"} Oct 02 09:48:13 crc kubenswrapper[4722]: I1002 09:48:13.784759 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"9103aa77-3c03-4ece-8293-b6ccd66a0cbb","Type":"ContainerStarted","Data":"965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8"} Oct 02 09:48:13 crc kubenswrapper[4722]: I1002 09:48:13.786643 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" 
event={"ID":"4383f5f8-eb32-4c6b-a3a5-0943a567e856","Type":"ContainerStarted","Data":"014a88a289d5137c4d5f9dff03cce6ac80c19ec64858e10f0244424e9dbdcbc2"} Oct 02 09:48:14 crc kubenswrapper[4722]: I1002 09:48:14.275043 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-wm7rf"] Oct 02 09:48:14 crc kubenswrapper[4722]: I1002 09:48:14.275930 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:48:14 crc kubenswrapper[4722]: I1002 09:48:14.277922 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-85zrm" Oct 02 09:48:14 crc kubenswrapper[4722]: I1002 09:48:14.288241 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-wm7rf"] Oct 02 09:48:14 crc kubenswrapper[4722]: I1002 09:48:14.431159 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzs4\" (UniqueName: \"kubernetes.io/projected/e762843d-7d09-47b6-bda1-92876c1eae97-kube-api-access-cmzs4\") pod \"rabbitmq-cluster-operator-index-wm7rf\" (UID: \"e762843d-7d09-47b6-bda1-92876c1eae97\") " pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:48:14 crc kubenswrapper[4722]: I1002 09:48:14.532362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmzs4\" (UniqueName: \"kubernetes.io/projected/e762843d-7d09-47b6-bda1-92876c1eae97-kube-api-access-cmzs4\") pod \"rabbitmq-cluster-operator-index-wm7rf\" (UID: \"e762843d-7d09-47b6-bda1-92876c1eae97\") " pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:48:14 crc kubenswrapper[4722]: I1002 09:48:14.559558 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzs4\" (UniqueName: \"kubernetes.io/projected/e762843d-7d09-47b6-bda1-92876c1eae97-kube-api-access-cmzs4\") pod \"rabbitmq-cluster-operator-index-wm7rf\" (UID: \"e762843d-7d09-47b6-bda1-92876c1eae97\") " pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:48:14 crc kubenswrapper[4722]: I1002 09:48:14.598464 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:48:15 crc kubenswrapper[4722]: I1002 09:48:15.109846 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-wm7rf"] Oct 02 09:48:15 crc kubenswrapper[4722]: W1002 09:48:15.137619 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode762843d_7d09_47b6_bda1_92876c1eae97.slice/crio-0d91fb651777c47475791d6ff13cf888179c2c9368ba217c7d2b2741b61e0a0c WatchSource:0}: Error finding container 0d91fb651777c47475791d6ff13cf888179c2c9368ba217c7d2b2741b61e0a0c: Status 404 returned error can't find the container with id 0d91fb651777c47475791d6ff13cf888179c2c9368ba217c7d2b2741b61e0a0c Oct 02 09:48:15 crc kubenswrapper[4722]: I1002 09:48:15.819484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" event={"ID":"e762843d-7d09-47b6-bda1-92876c1eae97","Type":"ContainerStarted","Data":"0d91fb651777c47475791d6ff13cf888179c2c9368ba217c7d2b2741b61e0a0c"} Oct 02 09:48:18 crc kubenswrapper[4722]: I1002 09:48:18.847469 4722 generic.go:334] "Generic (PLEG): container finished" podID="9103aa77-3c03-4ece-8293-b6ccd66a0cbb" containerID="965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8" exitCode=0 Oct 02 09:48:18 crc kubenswrapper[4722]: I1002 09:48:18.847644 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"9103aa77-3c03-4ece-8293-b6ccd66a0cbb","Type":"ContainerDied","Data":"965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8"} Oct 02 09:48:19 crc kubenswrapper[4722]: I1002 09:48:19.855286 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"9103aa77-3c03-4ece-8293-b6ccd66a0cbb","Type":"ContainerStarted","Data":"619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078"} Oct 02 09:48:19 crc kubenswrapper[4722]: I1002 09:48:19.856701 4722 generic.go:334] "Generic (PLEG): container finished" podID="4383f5f8-eb32-4c6b-a3a5-0943a567e856" containerID="014a88a289d5137c4d5f9dff03cce6ac80c19ec64858e10f0244424e9dbdcbc2" exitCode=0 Oct 02 09:48:19 crc kubenswrapper[4722]: I1002 09:48:19.856758 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"4383f5f8-eb32-4c6b-a3a5-0943a567e856","Type":"ContainerDied","Data":"014a88a289d5137c4d5f9dff03cce6ac80c19ec64858e10f0244424e9dbdcbc2"} Oct 02 09:48:19 crc kubenswrapper[4722]: I1002 09:48:19.875583 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-0" podStartSLOduration=8.957263238 podStartE2EDuration="17.875562168s" podCreationTimestamp="2025-10-02 09:48:02 +0000 UTC" firstStartedPulling="2025-10-02 09:48:04.314520484 +0000 UTC m=+940.351273464" lastFinishedPulling="2025-10-02 09:48:13.232819414 +0000 UTC m=+949.269572394" observedRunningTime="2025-10-02 09:48:19.871665601 +0000 UTC m=+955.908418591" watchObservedRunningTime="2025-10-02 09:48:19.875562168 +0000 UTC m=+955.912315148" Oct 02 09:48:20 crc kubenswrapper[4722]: I1002 09:48:20.864142 4722 generic.go:334] "Generic (PLEG): container finished" podID="7cb4f813-1f97-4272-943d-05fa71c599dc" containerID="f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb" exitCode=0 Oct 02 09:48:20 crc kubenswrapper[4722]: I1002 09:48:20.864186 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7cb4f813-1f97-4272-943d-05fa71c599dc","Type":"ContainerDied","Data":"f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb"} Oct 02 09:48:23 crc kubenswrapper[4722]: I1002 09:48:23.818428 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:23 crc kubenswrapper[4722]: I1002 09:48:23.819712 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:25 crc kubenswrapper[4722]: I1002 09:48:25.902157 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7cb4f813-1f97-4272-943d-05fa71c599dc","Type":"ContainerStarted","Data":"e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1"} Oct 02 09:48:25 crc kubenswrapper[4722]: I1002 09:48:25.907448 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"4383f5f8-eb32-4c6b-a3a5-0943a567e856","Type":"ContainerStarted","Data":"2230e2db400f30c4e67d870f42bef7d57c323d5b014b2863ebfd568758552b96"} Oct 02 09:48:25 crc kubenswrapper[4722]: I1002 09:48:25.928053 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-2" podStartSLOduration=14.776720005 podStartE2EDuration="23.928035052s" podCreationTimestamp="2025-10-02 09:48:02 +0000 UTC" firstStartedPulling="2025-10-02 09:48:04.188248408 +0000 UTC m=+940.225001388" lastFinishedPulling="2025-10-02 09:48:13.339563455 +0000 UTC m=+949.376316435" observedRunningTime="2025-10-02 09:48:25.927174371 +0000 UTC m=+961.963927371" watchObservedRunningTime="2025-10-02 09:48:25.928035052 +0000 UTC m=+961.964788032" Oct 02 09:48:25 crc kubenswrapper[4722]: I1002 09:48:25.949343 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-1" podStartSLOduration=15.019476474 podStartE2EDuration="23.949327311s" podCreationTimestamp="2025-10-02 09:48:02 +0000 UTC" firstStartedPulling="2025-10-02 09:48:04.304574507 +0000 UTC m=+940.341327487" lastFinishedPulling="2025-10-02 09:48:13.234425344 +0000 UTC m=+949.271178324" observedRunningTime="2025-10-02 09:48:25.947816794 +0000 UTC m=+961.984569784" watchObservedRunningTime="2025-10-02 09:48:25.949327311 +0000 UTC m=+961.986080291" Oct 02 09:48:26 crc kubenswrapper[4722]: I1002 09:48:26.917839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" event={"ID":"e762843d-7d09-47b6-bda1-92876c1eae97","Type":"ContainerStarted","Data":"61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537"} Oct 02 09:48:26 crc kubenswrapper[4722]: I1002 09:48:26.920036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"b4d42537-11aa-4a12-8137-feca0a8d889b","Type":"ContainerStarted","Data":"86b6af3ea1cebe16e80a7c7f66941e145bbcd18f9388440526dd59ea8766e63f"} Oct 02 09:48:26 crc kubenswrapper[4722]: I1002 09:48:26.920504 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:26 crc kubenswrapper[4722]: I1002 09:48:26.936701 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" podStartSLOduration=1.9007543249999999 
podStartE2EDuration="12.936676127s" podCreationTimestamp="2025-10-02 09:48:14 +0000 UTC" firstStartedPulling="2025-10-02 09:48:15.140298835 +0000 UTC m=+951.177051815" lastFinishedPulling="2025-10-02 09:48:26.176220627 +0000 UTC m=+962.212973617" observedRunningTime="2025-10-02 09:48:26.935756354 +0000 UTC m=+962.972509354" watchObservedRunningTime="2025-10-02 09:48:26.936676127 +0000 UTC m=+962.973429117" Oct 02 09:48:26 crc kubenswrapper[4722]: I1002 09:48:26.956261 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/memcached-0" podStartSLOduration=4.920207918 podStartE2EDuration="15.956232202s" podCreationTimestamp="2025-10-02 09:48:11 +0000 UTC" firstStartedPulling="2025-10-02 09:48:13.588686953 +0000 UTC m=+949.625439933" lastFinishedPulling="2025-10-02 09:48:24.624711237 +0000 UTC m=+960.661464217" observedRunningTime="2025-10-02 09:48:26.953644798 +0000 UTC m=+962.990397768" watchObservedRunningTime="2025-10-02 09:48:26.956232202 +0000 UTC m=+962.992985182" Oct 02 09:48:31 crc kubenswrapper[4722]: I1002 09:48:31.662491 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/memcached-0" Oct 02 09:48:33 crc kubenswrapper[4722]: I1002 09:48:33.833404 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:33 crc kubenswrapper[4722]: I1002 09:48:33.833691 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:33 crc kubenswrapper[4722]: I1002 09:48:33.847045 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:33 crc kubenswrapper[4722]: I1002 09:48:33.847119 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:33 crc kubenswrapper[4722]: I1002 09:48:33.894828 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:34 crc kubenswrapper[4722]: I1002 09:48:34.137181 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:48:34 crc kubenswrapper[4722]: I1002 09:48:34.599122 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:48:34 crc kubenswrapper[4722]: I1002 09:48:34.599536 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:48:34 crc kubenswrapper[4722]: I1002 09:48:34.630884 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:48:34 crc kubenswrapper[4722]: I1002 09:48:34.990682 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:48:40 crc kubenswrapper[4722]: I1002 09:48:40.697028 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:48:40 crc kubenswrapper[4722]: I1002 09:48:40.698613 4722 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.116531 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk"] Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.117690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.119955 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bc9s8" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.131508 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk"] Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.230061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.230170 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgj87\" (UniqueName: \"kubernetes.io/projected/9756f09b-bfeb-4b6e-806d-50602bdca92f-kube-api-access-xgj87\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.230224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.331795 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.332105 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgj87\" (UniqueName: \"kubernetes.io/projected/9756f09b-bfeb-4b6e-806d-50602bdca92f-kube-api-access-xgj87\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 
09:48:43.332170 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.332669 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.332714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.358448 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgj87\" (UniqueName: \"kubernetes.io/projected/9756f09b-bfeb-4b6e-806d-50602bdca92f-kube-api-access-xgj87\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.436085 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.887755 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/openstack-galera-2" podUID="7cb4f813-1f97-4272-943d-05fa71c599dc" containerName="galera" probeResult="failure" output=< Oct 02 09:48:43 crc kubenswrapper[4722]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Oct 02 09:48:43 crc kubenswrapper[4722]: > Oct 02 09:48:43 crc kubenswrapper[4722]: I1002 09:48:43.944288 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk"] Oct 02 09:48:44 crc kubenswrapper[4722]: I1002 09:48:44.039065 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" event={"ID":"9756f09b-bfeb-4b6e-806d-50602bdca92f","Type":"ContainerStarted","Data":"16ac98187a62e71acd478db896e4258168661a626e23d60b31161084255a346d"} Oct 02 09:48:45 crc kubenswrapper[4722]: I1002 09:48:45.047975 4722 generic.go:334] "Generic (PLEG): container finished" podID="9756f09b-bfeb-4b6e-806d-50602bdca92f" containerID="5f6438a7ccae3a1a60c661c9cfe870ab197b6051c1fe51f6ceba12f144989dd9" exitCode=0 Oct 02 09:48:45 crc kubenswrapper[4722]: I1002 09:48:45.048349 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" event={"ID":"9756f09b-bfeb-4b6e-806d-50602bdca92f","Type":"ContainerDied","Data":"5f6438a7ccae3a1a60c661c9cfe870ab197b6051c1fe51f6ceba12f144989dd9"} Oct 02 09:48:45 crc kubenswrapper[4722]: I1002 09:48:45.778769 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:45 crc kubenswrapper[4722]: I1002 09:48:45.828369 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:48:47 crc kubenswrapper[4722]: I1002 09:48:47.066352 4722 generic.go:334] "Generic (PLEG): container finished" podID="9756f09b-bfeb-4b6e-806d-50602bdca92f" containerID="9dd14ccf921c3cac01dace2efe4378440157b80408f691d4dbff833eeac166e0" exitCode=0 Oct 02 09:48:47 crc kubenswrapper[4722]: I1002 09:48:47.066723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" event={"ID":"9756f09b-bfeb-4b6e-806d-50602bdca92f","Type":"ContainerDied","Data":"9dd14ccf921c3cac01dace2efe4378440157b80408f691d4dbff833eeac166e0"} Oct 02 09:48:48 crc kubenswrapper[4722]: I1002 09:48:48.077495 4722 generic.go:334] "Generic (PLEG): container finished" podID="9756f09b-bfeb-4b6e-806d-50602bdca92f" containerID="3f69325ff4c10ac54e84948534aabd30afbdd129881ae6199aa3b3e30e46b556" exitCode=0 Oct 02 09:48:48 crc kubenswrapper[4722]: I1002 09:48:48.077545 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" event={"ID":"9756f09b-bfeb-4b6e-806d-50602bdca92f","Type":"ContainerDied","Data":"3f69325ff4c10ac54e84948534aabd30afbdd129881ae6199aa3b3e30e46b556"} Oct 02 09:48:48 crc kubenswrapper[4722]: I1002 09:48:48.240691 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:48 crc kubenswrapper[4722]: I1002 09:48:48.285469 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:48:49 crc kubenswrapper[4722]: I1002 09:48:49.364072 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:49 crc kubenswrapper[4722]: I1002 09:48:49.431711 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-util\") pod \"9756f09b-bfeb-4b6e-806d-50602bdca92f\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " Oct 02 09:48:49 crc kubenswrapper[4722]: I1002 09:48:49.431880 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgj87\" (UniqueName: \"kubernetes.io/projected/9756f09b-bfeb-4b6e-806d-50602bdca92f-kube-api-access-xgj87\") pod \"9756f09b-bfeb-4b6e-806d-50602bdca92f\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " Oct 02 09:48:49 crc kubenswrapper[4722]: I1002 09:48:49.432026 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-bundle\") pod \"9756f09b-bfeb-4b6e-806d-50602bdca92f\" (UID: \"9756f09b-bfeb-4b6e-806d-50602bdca92f\") " Oct 02 09:48:49 crc kubenswrapper[4722]: I1002 09:48:49.432745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-bundle" (OuterVolumeSpecName: "bundle") pod "9756f09b-bfeb-4b6e-806d-50602bdca92f" (UID: "9756f09b-bfeb-4b6e-806d-50602bdca92f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:48:49 crc kubenswrapper[4722]: I1002 09:48:49.437347 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9756f09b-bfeb-4b6e-806d-50602bdca92f-kube-api-access-xgj87" (OuterVolumeSpecName: "kube-api-access-xgj87") pod "9756f09b-bfeb-4b6e-806d-50602bdca92f" (UID: "9756f09b-bfeb-4b6e-806d-50602bdca92f"). InnerVolumeSpecName "kube-api-access-xgj87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:48:49 crc kubenswrapper[4722]: I1002 09:48:49.533276 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgj87\" (UniqueName: \"kubernetes.io/projected/9756f09b-bfeb-4b6e-806d-50602bdca92f-kube-api-access-xgj87\") on node \"crc\" DevicePath \"\"" Oct 02 09:48:49 crc kubenswrapper[4722]: I1002 09:48:49.533327 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:48:49 crc kubenswrapper[4722]: I1002 09:48:49.720737 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-util" (OuterVolumeSpecName: "util") pod "9756f09b-bfeb-4b6e-806d-50602bdca92f" (UID: "9756f09b-bfeb-4b6e-806d-50602bdca92f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:48:49 crc kubenswrapper[4722]: I1002 09:48:49.734861 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9756f09b-bfeb-4b6e-806d-50602bdca92f-util\") on node \"crc\" DevicePath \"\"" Oct 02 09:48:50 crc kubenswrapper[4722]: I1002 09:48:50.094292 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" event={"ID":"9756f09b-bfeb-4b6e-806d-50602bdca92f","Type":"ContainerDied","Data":"16ac98187a62e71acd478db896e4258168661a626e23d60b31161084255a346d"} Oct 02 09:48:50 crc kubenswrapper[4722]: I1002 09:48:50.094383 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16ac98187a62e71acd478db896e4258168661a626e23d60b31161084255a346d" Oct 02 09:48:50 crc kubenswrapper[4722]: I1002 09:48:50.094349 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.418018 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm"] Oct 02 09:48:59 crc kubenswrapper[4722]: E1002 09:48:59.418839 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9756f09b-bfeb-4b6e-806d-50602bdca92f" containerName="extract" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.418856 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9756f09b-bfeb-4b6e-806d-50602bdca92f" containerName="extract" Oct 02 09:48:59 crc kubenswrapper[4722]: E1002 09:48:59.418878 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9756f09b-bfeb-4b6e-806d-50602bdca92f" containerName="util" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.418886 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9756f09b-bfeb-4b6e-806d-50602bdca92f" containerName="util" Oct 02 09:48:59 crc kubenswrapper[4722]: E1002 09:48:59.418906 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9756f09b-bfeb-4b6e-806d-50602bdca92f" containerName="pull" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.418913 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9756f09b-bfeb-4b6e-806d-50602bdca92f" containerName="pull" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.419054 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9756f09b-bfeb-4b6e-806d-50602bdca92f" containerName="extract" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.419585 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.422196 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-gxwft" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.428329 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm"] Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.463458 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqpz7\" (UniqueName: \"kubernetes.io/projected/812dc418-7dba-4a4c-9b9b-20eb9d801e0a-kube-api-access-dqpz7\") pod \"rabbitmq-cluster-operator-779fc9694b-xnjdm\" (UID: \"812dc418-7dba-4a4c-9b9b-20eb9d801e0a\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.565461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqpz7\" (UniqueName: \"kubernetes.io/projected/812dc418-7dba-4a4c-9b9b-20eb9d801e0a-kube-api-access-dqpz7\") pod \"rabbitmq-cluster-operator-779fc9694b-xnjdm\" (UID: \"812dc418-7dba-4a4c-9b9b-20eb9d801e0a\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.598469 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqpz7\" (UniqueName: \"kubernetes.io/projected/812dc418-7dba-4a4c-9b9b-20eb9d801e0a-kube-api-access-dqpz7\") pod \"rabbitmq-cluster-operator-779fc9694b-xnjdm\" (UID: \"812dc418-7dba-4a4c-9b9b-20eb9d801e0a\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" Oct 02 09:48:59 crc kubenswrapper[4722]: I1002 09:48:59.741925 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" Oct 02 09:49:00 crc kubenswrapper[4722]: I1002 09:49:00.156559 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm"] Oct 02 09:49:01 crc kubenswrapper[4722]: I1002 09:49:01.161915 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" event={"ID":"812dc418-7dba-4a4c-9b9b-20eb9d801e0a","Type":"ContainerStarted","Data":"ebcea3482798ec98eff97b76a89ec30a74a1864e6b879d2dd746409a1271bf4b"} Oct 02 09:49:03 crc kubenswrapper[4722]: I1002 09:49:03.173593 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" event={"ID":"812dc418-7dba-4a4c-9b9b-20eb9d801e0a","Type":"ContainerStarted","Data":"dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834"} Oct 02 09:49:03 crc kubenswrapper[4722]: I1002 09:49:03.189116 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" podStartSLOduration=1.506220389 podStartE2EDuration="4.189099572s" podCreationTimestamp="2025-10-02 09:48:59 +0000 UTC" firstStartedPulling="2025-10-02 09:49:00.166640994 +0000 UTC m=+996.203393974" lastFinishedPulling="2025-10-02 09:49:02.849520177 +0000 UTC m=+998.886273157" observedRunningTime="2025-10-02 09:49:03.186463086 +0000 UTC m=+999.223216086" watchObservedRunningTime="2025-10-02 09:49:03.189099572 +0000 UTC m=+999.225852552" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.543127 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.544438 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.546479 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-default-user" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.546829 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-server-dockercfg-jdpln" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.547028 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"rabbitmq-server-conf" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.547171 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-erlang-cookie" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.549636 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"rabbitmq-plugins-conf" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.560934 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.605834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.605909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.605947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/81c18fc8-43c0-4993-b297-c158ba94f076-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.605970 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf9cg\" (UniqueName: \"kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-kube-api-access-cf9cg\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.606052 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.606099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " 
pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.606124 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/81c18fc8-43c0-4993-b297-c158ba94f076-pod-info\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.606199 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/81c18fc8-43c0-4993-b297-c158ba94f076-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.707899 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/81c18fc8-43c0-4993-b297-c158ba94f076-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.707956 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.707980 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.708007 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/81c18fc8-43c0-4993-b297-c158ba94f076-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.708052 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf9cg\" (UniqueName: \"kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-kube-api-access-cf9cg\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.708112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.708137 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " 
pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.708154 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/81c18fc8-43c0-4993-b297-c158ba94f076-pod-info\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.708890 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.709886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.710981 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/81c18fc8-43c0-4993-b297-c158ba94f076-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.717091 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/81c18fc8-43c0-4993-b297-c158ba94f076-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.717495 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.720898 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/81c18fc8-43c0-4993-b297-c158ba94f076-pod-info\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.731867 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.731914 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/59724ab17fe769096169fd1b3d016e1564b53c3cf8b307c584c0d7044fa09fa3/globalmount\"" pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.747233 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf9cg\" (UniqueName: \"kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-kube-api-access-cf9cg\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.852907 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\") pod \"rabbitmq-server-0\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:09 crc kubenswrapper[4722]: I1002 09:49:09.866852 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:10 crc kubenswrapper[4722]: I1002 09:49:10.081189 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Oct 02 09:49:10 crc kubenswrapper[4722]: I1002 09:49:10.213728 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"81c18fc8-43c0-4993-b297-c158ba94f076","Type":"ContainerStarted","Data":"b81787c02edc6462cb4ab799ca839e49e9429d7e87bd5eea9025347b98d6fca8"} Oct 02 09:49:10 crc kubenswrapper[4722]: I1002 09:49:10.696827 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:49:10 crc kubenswrapper[4722]: I1002 09:49:10.696912 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:49:18 crc kubenswrapper[4722]: I1002 09:49:18.284230 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"81c18fc8-43c0-4993-b297-c158ba94f076","Type":"ContainerStarted","Data":"f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d"} Oct 02 09:49:19 crc kubenswrapper[4722]: I1002 09:49:19.482257 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-2scxc"] Oct 02 09:49:19 crc kubenswrapper[4722]: I1002 09:49:19.483212 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-2scxc" Oct 02 09:49:19 crc kubenswrapper[4722]: I1002 09:49:19.488597 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-w2tgh" Oct 02 09:49:19 crc kubenswrapper[4722]: I1002 09:49:19.492390 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-2scxc"] Oct 02 09:49:19 crc kubenswrapper[4722]: I1002 09:49:19.639483 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52ffs\" (UniqueName: \"kubernetes.io/projected/d3316707-b1db-402e-8a03-f123186c568f-kube-api-access-52ffs\") pod \"keystone-operator-index-2scxc\" (UID: \"d3316707-b1db-402e-8a03-f123186c568f\") " pod="openstack-operators/keystone-operator-index-2scxc" Oct 02 09:49:19 crc kubenswrapper[4722]: I1002 09:49:19.741097 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52ffs\" (UniqueName: \"kubernetes.io/projected/d3316707-b1db-402e-8a03-f123186c568f-kube-api-access-52ffs\") pod \"keystone-operator-index-2scxc\" (UID: \"d3316707-b1db-402e-8a03-f123186c568f\") " pod="openstack-operators/keystone-operator-index-2scxc" Oct 02 09:49:19 crc kubenswrapper[4722]: I1002 09:49:19.762994 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52ffs\" (UniqueName: \"kubernetes.io/projected/d3316707-b1db-402e-8a03-f123186c568f-kube-api-access-52ffs\") pod \"keystone-operator-index-2scxc\" (UID: \"d3316707-b1db-402e-8a03-f123186c568f\") " pod="openstack-operators/keystone-operator-index-2scxc" Oct 02 09:49:19 crc kubenswrapper[4722]: I1002 09:49:19.803003 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-2scxc" Oct 02 09:49:20 crc kubenswrapper[4722]: I1002 09:49:20.008848 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-2scxc"] Oct 02 09:49:20 crc kubenswrapper[4722]: W1002 09:49:20.015335 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3316707_b1db_402e_8a03_f123186c568f.slice/crio-4a3518982701a61286b22b16cc1c49ae2157c39759ec52604f09ce7f58830a0c WatchSource:0}: Error finding container 4a3518982701a61286b22b16cc1c49ae2157c39759ec52604f09ce7f58830a0c: Status 404 returned error can't find the container with id 4a3518982701a61286b22b16cc1c49ae2157c39759ec52604f09ce7f58830a0c Oct 02 09:49:20 crc kubenswrapper[4722]: I1002 09:49:20.298665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-2scxc" event={"ID":"d3316707-b1db-402e-8a03-f123186c568f","Type":"ContainerStarted","Data":"4a3518982701a61286b22b16cc1c49ae2157c39759ec52604f09ce7f58830a0c"} Oct 02 09:49:22 crc kubenswrapper[4722]: I1002 09:49:22.311767 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-2scxc" event={"ID":"d3316707-b1db-402e-8a03-f123186c568f","Type":"ContainerStarted","Data":"50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb"} Oct 02 09:49:22 crc kubenswrapper[4722]: I1002 09:49:22.325439 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-2scxc" podStartSLOduration=1.675103593 podStartE2EDuration="3.325419936s" podCreationTimestamp="2025-10-02 09:49:19 +0000 UTC" firstStartedPulling="2025-10-02 09:49:20.018276297 +0000 UTC m=+1016.055029277" lastFinishedPulling="2025-10-02 09:49:21.66859264 +0000 UTC m=+1017.705345620" observedRunningTime="2025-10-02 09:49:22.32436117 +0000 UTC m=+1018.361114160" watchObservedRunningTime="2025-10-02 09:49:22.325419936 +0000 UTC m=+1018.362172916" Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.070221 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-2scxc"] Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.322755 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-2scxc" podUID="d3316707-b1db-402e-8a03-f123186c568f" containerName="registry-server" containerID="cri-o://50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb" gracePeriod=2 Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.680412 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-zldfm"] Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.682202 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.691838 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-zldfm"] Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.695237 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-2scxc" Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.810160 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52ffs\" (UniqueName: \"kubernetes.io/projected/d3316707-b1db-402e-8a03-f123186c568f-kube-api-access-52ffs\") pod \"d3316707-b1db-402e-8a03-f123186c568f\" (UID: \"d3316707-b1db-402e-8a03-f123186c568f\") " Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.811337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j595\" (UniqueName: \"kubernetes.io/projected/d5efc7dc-5ee4-4e70-ad78-acd076852fcb-kube-api-access-7j595\") pod \"keystone-operator-index-zldfm\" (UID: \"d5efc7dc-5ee4-4e70-ad78-acd076852fcb\") " pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.824055 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3316707-b1db-402e-8a03-f123186c568f-kube-api-access-52ffs" (OuterVolumeSpecName: "kube-api-access-52ffs") pod "d3316707-b1db-402e-8a03-f123186c568f" (UID: "d3316707-b1db-402e-8a03-f123186c568f"). InnerVolumeSpecName "kube-api-access-52ffs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.912257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j595\" (UniqueName: \"kubernetes.io/projected/d5efc7dc-5ee4-4e70-ad78-acd076852fcb-kube-api-access-7j595\") pod \"keystone-operator-index-zldfm\" (UID: \"d5efc7dc-5ee4-4e70-ad78-acd076852fcb\") " pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.912402 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52ffs\" (UniqueName: \"kubernetes.io/projected/d3316707-b1db-402e-8a03-f123186c568f-kube-api-access-52ffs\") on node \"crc\" DevicePath \"\"" Oct 02 09:49:24 crc kubenswrapper[4722]: I1002 09:49:24.932041 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j595\" (UniqueName: \"kubernetes.io/projected/d5efc7dc-5ee4-4e70-ad78-acd076852fcb-kube-api-access-7j595\") pod \"keystone-operator-index-zldfm\" (UID: \"d5efc7dc-5ee4-4e70-ad78-acd076852fcb\") " pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.010093 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.236452 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-zldfm"] Oct 02 09:49:25 crc kubenswrapper[4722]: W1002 09:49:25.244163 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5efc7dc_5ee4_4e70_ad78_acd076852fcb.slice/crio-a0012f15a55016abbbd2a428e70886543be73b122284d4bc316f32d32814b6d7 WatchSource:0}: Error finding container a0012f15a55016abbbd2a428e70886543be73b122284d4bc316f32d32814b6d7: Status 404 returned error can't find the container with id a0012f15a55016abbbd2a428e70886543be73b122284d4bc316f32d32814b6d7 Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.330104 4722 generic.go:334] "Generic (PLEG): container finished" podID="d3316707-b1db-402e-8a03-f123186c568f" containerID="50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb" exitCode=0 Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.330165 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-2scxc" event={"ID":"d3316707-b1db-402e-8a03-f123186c568f","Type":"ContainerDied","Data":"50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb"} Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.330194 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-2scxc" event={"ID":"d3316707-b1db-402e-8a03-f123186c568f","Type":"ContainerDied","Data":"4a3518982701a61286b22b16cc1c49ae2157c39759ec52604f09ce7f58830a0c"} Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.330210 4722 scope.go:117] "RemoveContainer" containerID="50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb" Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.330298 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-2scxc" Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.334632 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-zldfm" event={"ID":"d5efc7dc-5ee4-4e70-ad78-acd076852fcb","Type":"ContainerStarted","Data":"a0012f15a55016abbbd2a428e70886543be73b122284d4bc316f32d32814b6d7"} Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.352142 4722 scope.go:117] "RemoveContainer" containerID="50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb" Oct 02 09:49:25 crc kubenswrapper[4722]: E1002 09:49:25.353413 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb\": container with ID starting with 50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb not found: ID does not exist" containerID="50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb" Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.353457 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb"} err="failed to get container status \"50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb\": rpc error: code = NotFound desc = could not find container \"50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb\": container with ID starting with 50614909bb6dff7d9471ce934d998eeec3f579907f88b77935c51ab477d689cb not found: ID does not exist" Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.364031 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-2scxc"] Oct 02 09:49:25 crc kubenswrapper[4722]: I1002 09:49:25.367614 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-2scxc"] Oct 02 09:49:26 crc kubenswrapper[4722]: I1002 09:49:26.719664 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3316707-b1db-402e-8a03-f123186c568f" path="/var/lib/kubelet/pods/d3316707-b1db-402e-8a03-f123186c568f/volumes" Oct 02 09:49:27 crc kubenswrapper[4722]: I1002 09:49:27.347956 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-zldfm" event={"ID":"d5efc7dc-5ee4-4e70-ad78-acd076852fcb","Type":"ContainerStarted","Data":"c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659"} Oct 02 09:49:27 crc kubenswrapper[4722]: I1002 09:49:27.372501 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-zldfm" podStartSLOduration=2.237651245 podStartE2EDuration="3.372474764s" podCreationTimestamp="2025-10-02 09:49:24 +0000 UTC" firstStartedPulling="2025-10-02 09:49:25.248655808 +0000 UTC m=+1021.285408788" lastFinishedPulling="2025-10-02 09:49:26.383479327 +0000 UTC m=+1022.420232307" observedRunningTime="2025-10-02 09:49:27.370059514 +0000 UTC m=+1023.406812534" watchObservedRunningTime="2025-10-02 09:49:27.372474764 +0000 UTC m=+1023.409227744" Oct 02 09:49:35 crc kubenswrapper[4722]: I1002 09:49:35.010268 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:49:35 crc kubenswrapper[4722]: I1002 09:49:35.010800 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:49:35 crc kubenswrapper[4722]: I1002 09:49:35.039278 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:49:35 crc kubenswrapper[4722]: I1002 09:49:35.420024 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.109479 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9"] Oct 02 09:49:37 crc kubenswrapper[4722]: E1002 09:49:37.109751 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3316707-b1db-402e-8a03-f123186c568f" containerName="registry-server" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.109767 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3316707-b1db-402e-8a03-f123186c568f" containerName="registry-server" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.109905 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3316707-b1db-402e-8a03-f123186c568f" containerName="registry-server" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.110925 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.115058 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bc9s8" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.116680 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9"] Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.192434 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-bundle\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.192499 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzqx4\" (UniqueName: \"kubernetes.io/projected/ba9a9776-4156-4177-a093-f358250b4552-kube-api-access-mzqx4\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.192543 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-util\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.293403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-bundle\") pod 
\"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.293485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzqx4\" (UniqueName: \"kubernetes.io/projected/ba9a9776-4156-4177-a093-f358250b4552-kube-api-access-mzqx4\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.293531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-util\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.293903 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-bundle\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.293934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-util\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.316995 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzqx4\" (UniqueName: \"kubernetes.io/projected/ba9a9776-4156-4177-a093-f358250b4552-kube-api-access-mzqx4\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.429927 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:37 crc kubenswrapper[4722]: I1002 09:49:37.845361 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9"] Oct 02 09:49:37 crc kubenswrapper[4722]: W1002 09:49:37.852947 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba9a9776_4156_4177_a093_f358250b4552.slice/crio-c56fa5716a0fd44a7dc58a98389dc2bc7aa102aa32bfd3d633549bca5ce135b0 WatchSource:0}: Error finding container c56fa5716a0fd44a7dc58a98389dc2bc7aa102aa32bfd3d633549bca5ce135b0: Status 404 returned error can't find the container with id c56fa5716a0fd44a7dc58a98389dc2bc7aa102aa32bfd3d633549bca5ce135b0 Oct 02 09:49:38 crc kubenswrapper[4722]: I1002 09:49:38.415558 4722 generic.go:334] "Generic (PLEG): container finished" podID="ba9a9776-4156-4177-a093-f358250b4552" containerID="72e6cb9e7b3a51db3544adb68d8fca6905cc97b68c242ef8da4a19cd9f479b97" exitCode=0 Oct 02 09:49:38 crc kubenswrapper[4722]: I1002 09:49:38.415606 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" event={"ID":"ba9a9776-4156-4177-a093-f358250b4552","Type":"ContainerDied","Data":"72e6cb9e7b3a51db3544adb68d8fca6905cc97b68c242ef8da4a19cd9f479b97"} Oct 02 09:49:38 crc kubenswrapper[4722]: I1002 09:49:38.415657 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" event={"ID":"ba9a9776-4156-4177-a093-f358250b4552","Type":"ContainerStarted","Data":"c56fa5716a0fd44a7dc58a98389dc2bc7aa102aa32bfd3d633549bca5ce135b0"} Oct 02 09:49:39 crc kubenswrapper[4722]: I1002 09:49:39.440759 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" event={"ID":"ba9a9776-4156-4177-a093-f358250b4552","Type":"ContainerStarted","Data":"4642f406934c0358bbd514277de06a9df9a755c6bd1a976ec91ea2a636cbf1b7"} Oct 02 09:49:40 crc kubenswrapper[4722]: I1002 09:49:40.449509 4722 generic.go:334] "Generic (PLEG): container finished" podID="ba9a9776-4156-4177-a093-f358250b4552" containerID="4642f406934c0358bbd514277de06a9df9a755c6bd1a976ec91ea2a636cbf1b7" exitCode=0 Oct 02 09:49:40 crc kubenswrapper[4722]: I1002 09:49:40.449552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" event={"ID":"ba9a9776-4156-4177-a093-f358250b4552","Type":"ContainerDied","Data":"4642f406934c0358bbd514277de06a9df9a755c6bd1a976ec91ea2a636cbf1b7"} Oct 02 09:49:40 crc kubenswrapper[4722]: I1002 09:49:40.697168 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:49:40 crc kubenswrapper[4722]: I1002 09:49:40.697580 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 02 09:49:40 crc kubenswrapper[4722]: I1002 09:49:40.697639 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:49:40 crc kubenswrapper[4722]: I1002 09:49:40.698377 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"424c2680b223e9db678ffaf1f8b8a1f4cab14a156d3b1044d8be0e10fbad104c"} pod="openshift-machine-config-operator/machine-config-daemon-q6h76" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 09:49:40 crc kubenswrapper[4722]: I1002 09:49:40.698443 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" containerID="cri-o://424c2680b223e9db678ffaf1f8b8a1f4cab14a156d3b1044d8be0e10fbad104c" gracePeriod=600 Oct 02 09:49:41 crc kubenswrapper[4722]: I1002 09:49:41.457920 4722 generic.go:334] "Generic (PLEG): container finished" podID="f662826c-e772-4f56-a85f-83971aaef356" containerID="424c2680b223e9db678ffaf1f8b8a1f4cab14a156d3b1044d8be0e10fbad104c" exitCode=0 Oct 02 09:49:41 crc kubenswrapper[4722]: I1002 09:49:41.457990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerDied","Data":"424c2680b223e9db678ffaf1f8b8a1f4cab14a156d3b1044d8be0e10fbad104c"} Oct 02 09:49:41 crc kubenswrapper[4722]: I1002 09:49:41.459208 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerStarted","Data":"c62ee09a3389642011d46d3c51dac1ea6cb21564e2a50b3f560683d644ff19e5"} Oct 02 09:49:41 crc kubenswrapper[4722]: I1002 09:49:41.459252 4722 scope.go:117] "RemoveContainer" containerID="2e23a92796174868d03a49e3fa647f65a4cb15be5ec415c3eed20693726accb6" Oct 02 09:49:41 crc kubenswrapper[4722]: I1002 09:49:41.462723 4722 generic.go:334] "Generic (PLEG): container finished" podID="ba9a9776-4156-4177-a093-f358250b4552" containerID="ed3d756892bd016811916bf95d762940d1155429d135e9473d4c5dce7772a53d" exitCode=0 Oct 02 09:49:41 crc kubenswrapper[4722]: I1002 09:49:41.462826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" event={"ID":"ba9a9776-4156-4177-a093-f358250b4552","Type":"ContainerDied","Data":"ed3d756892bd016811916bf95d762940d1155429d135e9473d4c5dce7772a53d"} Oct 02 09:49:42 crc kubenswrapper[4722]: I1002 09:49:42.742749 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:42 crc kubenswrapper[4722]: I1002 09:49:42.869627 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzqx4\" (UniqueName: \"kubernetes.io/projected/ba9a9776-4156-4177-a093-f358250b4552-kube-api-access-mzqx4\") pod \"ba9a9776-4156-4177-a093-f358250b4552\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " Oct 02 09:49:42 crc kubenswrapper[4722]: I1002 09:49:42.869707 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-bundle\") pod \"ba9a9776-4156-4177-a093-f358250b4552\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " Oct 02 09:49:42 crc kubenswrapper[4722]: I1002 09:49:42.869748 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-util\") pod \"ba9a9776-4156-4177-a093-f358250b4552\" (UID: \"ba9a9776-4156-4177-a093-f358250b4552\") " Oct 02 09:49:42 crc kubenswrapper[4722]: I1002 09:49:42.871229 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-bundle" (OuterVolumeSpecName: "bundle") pod "ba9a9776-4156-4177-a093-f358250b4552" (UID: "ba9a9776-4156-4177-a093-f358250b4552"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:49:42 crc kubenswrapper[4722]: I1002 09:49:42.879614 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba9a9776-4156-4177-a093-f358250b4552-kube-api-access-mzqx4" (OuterVolumeSpecName: "kube-api-access-mzqx4") pod "ba9a9776-4156-4177-a093-f358250b4552" (UID: "ba9a9776-4156-4177-a093-f358250b4552"). InnerVolumeSpecName "kube-api-access-mzqx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:49:42 crc kubenswrapper[4722]: I1002 09:49:42.885218 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-util" (OuterVolumeSpecName: "util") pod "ba9a9776-4156-4177-a093-f358250b4552" (UID: "ba9a9776-4156-4177-a093-f358250b4552"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:49:42 crc kubenswrapper[4722]: I1002 09:49:42.971282 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzqx4\" (UniqueName: \"kubernetes.io/projected/ba9a9776-4156-4177-a093-f358250b4552-kube-api-access-mzqx4\") on node \"crc\" DevicePath \"\"" Oct 02 09:49:42 crc kubenswrapper[4722]: I1002 09:49:42.971350 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:49:42 crc kubenswrapper[4722]: I1002 09:49:42.971361 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9776-4156-4177-a093-f358250b4552-util\") on node \"crc\" DevicePath \"\"" Oct 02 09:49:43 crc kubenswrapper[4722]: I1002 09:49:43.482330 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" event={"ID":"ba9a9776-4156-4177-a093-f358250b4552","Type":"ContainerDied","Data":"c56fa5716a0fd44a7dc58a98389dc2bc7aa102aa32bfd3d633549bca5ce135b0"} Oct 02 09:49:43 crc kubenswrapper[4722]: I1002 09:49:43.482656 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56fa5716a0fd44a7dc58a98389dc2bc7aa102aa32bfd3d633549bca5ce135b0" Oct 02 09:49:43 crc kubenswrapper[4722]: I1002 09:49:43.482395 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9" Oct 02 09:49:50 crc kubenswrapper[4722]: I1002 09:49:50.529410 4722 generic.go:334] "Generic (PLEG): container finished" podID="81c18fc8-43c0-4993-b297-c158ba94f076" containerID="f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d" exitCode=0 Oct 02 09:49:50 crc kubenswrapper[4722]: I1002 09:49:50.530270 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"81c18fc8-43c0-4993-b297-c158ba94f076","Type":"ContainerDied","Data":"f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d"} Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.415205 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97"] Oct 02 09:49:51 crc kubenswrapper[4722]: E1002 09:49:51.415890 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9a9776-4156-4177-a093-f358250b4552" containerName="util" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.415915 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9a9776-4156-4177-a093-f358250b4552" containerName="util" Oct 02 09:49:51 crc kubenswrapper[4722]: E1002 09:49:51.415943 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9a9776-4156-4177-a093-f358250b4552" containerName="pull" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.415953 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9a9776-4156-4177-a093-f358250b4552" containerName="pull" Oct 02 09:49:51 crc kubenswrapper[4722]: E1002 09:49:51.415965 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9a9776-4156-4177-a093-f358250b4552" containerName="extract" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.415973 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9a9776-4156-4177-a093-f358250b4552" containerName="extract" Oct 02 
09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.416120 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba9a9776-4156-4177-a093-f358250b4552" containerName="extract" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.416902 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.418545 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.418695 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hngcl" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.432483 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97"] Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.540459 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"81c18fc8-43c0-4993-b297-c158ba94f076","Type":"ContainerStarted","Data":"f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb"} Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.540689 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.581965 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.737259942 podStartE2EDuration="43.581943182s" podCreationTimestamp="2025-10-02 09:49:08 +0000 UTC" firstStartedPulling="2025-10-02 09:49:10.100772609 +0000 UTC m=+1006.137525589" lastFinishedPulling="2025-10-02 09:49:16.945455849 +0000 UTC m=+1012.982208829" observedRunningTime="2025-10-02 09:49:51.575354028 +0000 UTC m=+1047.612107018" watchObservedRunningTime="2025-10-02 09:49:51.581943182 +0000 UTC m=+1047.618696162" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.592261 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-apiservice-cert\") pod \"keystone-operator-controller-manager-6979dbf55-qtk97\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.592345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-webhook-cert\") pod \"keystone-operator-controller-manager-6979dbf55-qtk97\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.592545 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bmx\" (UniqueName: \"kubernetes.io/projected/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-kube-api-access-t8bmx\") pod \"keystone-operator-controller-manager-6979dbf55-qtk97\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:51 
crc kubenswrapper[4722]: I1002 09:49:51.694287 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-webhook-cert\") pod \"keystone-operator-controller-manager-6979dbf55-qtk97\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.694776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8bmx\" (UniqueName: \"kubernetes.io/projected/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-kube-api-access-t8bmx\") pod \"keystone-operator-controller-manager-6979dbf55-qtk97\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.694993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-apiservice-cert\") pod \"keystone-operator-controller-manager-6979dbf55-qtk97\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.702205 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-webhook-cert\") pod \"keystone-operator-controller-manager-6979dbf55-qtk97\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.702679 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-apiservice-cert\") pod \"keystone-operator-controller-manager-6979dbf55-qtk97\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.724237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bmx\" (UniqueName: \"kubernetes.io/projected/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-kube-api-access-t8bmx\") pod \"keystone-operator-controller-manager-6979dbf55-qtk97\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:51 crc kubenswrapper[4722]: I1002 09:49:51.736175 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:52 crc kubenswrapper[4722]: I1002 09:49:52.017632 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97"] Oct 02 09:49:52 crc kubenswrapper[4722]: I1002 09:49:52.549473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" event={"ID":"df9a46f7-6264-4bc4-b5f2-87a84a38a06c","Type":"ContainerStarted","Data":"e92861c3ac290b36600758812a0faa9cef1c3b06521e25f64674998cadf3895a"} Oct 02 09:49:55 crc kubenswrapper[4722]: I1002 09:49:55.572975 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" event={"ID":"df9a46f7-6264-4bc4-b5f2-87a84a38a06c","Type":"ContainerStarted","Data":"47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72"} Oct 02 09:49:55 crc kubenswrapper[4722]: I1002 09:49:55.573489 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:49:55 crc kubenswrapper[4722]: I1002 09:49:55.573540 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" event={"ID":"df9a46f7-6264-4bc4-b5f2-87a84a38a06c","Type":"ContainerStarted","Data":"049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0"} Oct 02 09:49:55 crc kubenswrapper[4722]: I1002 09:49:55.593031 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" podStartSLOduration=2.227972609 podStartE2EDuration="4.593011896s" podCreationTimestamp="2025-10-02 09:49:51 +0000 UTC" firstStartedPulling="2025-10-02 09:49:52.027564711 +0000 UTC m=+1048.064317691" lastFinishedPulling="2025-10-02 09:49:54.392603998 +0000 UTC m=+1050.429356978" observedRunningTime="2025-10-02 09:49:55.589633502 +0000 UTC m=+1051.626386482" watchObservedRunningTime="2025-10-02 09:49:55.593011896 +0000 UTC m=+1051.629764876" Oct 02 09:50:01 crc kubenswrapper[4722]: I1002 09:50:01.740444 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:50:06 crc kubenswrapper[4722]: I1002 09:50:06.875491 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-fsxkj"] Oct 02 09:50:06 crc kubenswrapper[4722]: I1002 09:50:06.876976 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:50:06 crc kubenswrapper[4722]: I1002 09:50:06.879040 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-index-dockercfg-sb2br" Oct 02 09:50:06 crc kubenswrapper[4722]: I1002 09:50:06.883217 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-fsxkj"] Oct 02 09:50:07 crc kubenswrapper[4722]: I1002 09:50:07.010557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slsd4\" (UniqueName: \"kubernetes.io/projected/e09d4bc8-436e-4faa-9416-3d76a569c97e-kube-api-access-slsd4\") pod \"barbican-operator-index-fsxkj\" (UID: \"e09d4bc8-436e-4faa-9416-3d76a569c97e\") " pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:50:07 crc kubenswrapper[4722]: I1002 09:50:07.112074 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slsd4\" (UniqueName: \"kubernetes.io/projected/e09d4bc8-436e-4faa-9416-3d76a569c97e-kube-api-access-slsd4\") pod \"barbican-operator-index-fsxkj\" (UID: \"e09d4bc8-436e-4faa-9416-3d76a569c97e\") " pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:50:07 crc kubenswrapper[4722]: I1002 09:50:07.138052 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slsd4\" (UniqueName: \"kubernetes.io/projected/e09d4bc8-436e-4faa-9416-3d76a569c97e-kube-api-access-slsd4\") pod \"barbican-operator-index-fsxkj\" (UID: \"e09d4bc8-436e-4faa-9416-3d76a569c97e\") " pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:50:07 crc kubenswrapper[4722]: I1002 09:50:07.198569 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:50:07 crc kubenswrapper[4722]: I1002 09:50:07.662094 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-fsxkj"] Oct 02 09:50:07 crc kubenswrapper[4722]: W1002 09:50:07.665708 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode09d4bc8_436e_4faa_9416_3d76a569c97e.slice/crio-14bf8f5a9371cbcb71c3447895a32c6714d38c9a6235ff4c2b4f083d13b6d7a7 WatchSource:0}: Error finding container 14bf8f5a9371cbcb71c3447895a32c6714d38c9a6235ff4c2b4f083d13b6d7a7: Status 404 returned error can't find the container with id 14bf8f5a9371cbcb71c3447895a32c6714d38c9a6235ff4c2b4f083d13b6d7a7 Oct 02 09:50:08 crc kubenswrapper[4722]: I1002 09:50:08.660206 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-fsxkj" event={"ID":"e09d4bc8-436e-4faa-9416-3d76a569c97e","Type":"ContainerStarted","Data":"14bf8f5a9371cbcb71c3447895a32c6714d38c9a6235ff4c2b4f083d13b6d7a7"} Oct 02 09:50:09 crc kubenswrapper[4722]: I1002 09:50:09.875534 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:50:10 crc kubenswrapper[4722]: I1002 09:50:10.686595 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-fsxkj" event={"ID":"e09d4bc8-436e-4faa-9416-3d76a569c97e","Type":"ContainerStarted","Data":"e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82"} Oct 02 09:50:10 crc kubenswrapper[4722]: I1002 09:50:10.705716 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-fsxkj" podStartSLOduration=2.1782272320000002 podStartE2EDuration="4.705671623s" podCreationTimestamp="2025-10-02 09:50:06 +0000 UTC" firstStartedPulling="2025-10-02 09:50:07.676721034 +0000 UTC m=+1063.713474014" lastFinishedPulling="2025-10-02 09:50:10.204165425 +0000 UTC m=+1066.240918405" observedRunningTime="2025-10-02 09:50:10.700017302 +0000 UTC m=+1066.736770302" watchObservedRunningTime="2025-10-02 09:50:10.705671623 +0000 UTC m=+1066.742424603" Oct 02 09:50:14 crc kubenswrapper[4722]: I1002 09:50:14.339609 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-db-create-llpsl"] Oct 02 09:50:14 crc kubenswrapper[4722]: I1002 09:50:14.341915 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-llpsl" Oct 02 09:50:14 crc kubenswrapper[4722]: I1002 09:50:14.348601 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-llpsl"] Oct 02 09:50:14 crc kubenswrapper[4722]: I1002 09:50:14.414503 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6r72\" (UniqueName: \"kubernetes.io/projected/4715d909-cf7b-4a50-add4-38e3482e6a5a-kube-api-access-h6r72\") pod \"keystone-db-create-llpsl\" (UID: \"4715d909-cf7b-4a50-add4-38e3482e6a5a\") " pod="barbican-kuttl-tests/keystone-db-create-llpsl" Oct 02 09:50:14 crc kubenswrapper[4722]: I1002 09:50:14.515963 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6r72\" (UniqueName: \"kubernetes.io/projected/4715d909-cf7b-4a50-add4-38e3482e6a5a-kube-api-access-h6r72\") pod \"keystone-db-create-llpsl\" (UID: \"4715d909-cf7b-4a50-add4-38e3482e6a5a\") " pod="barbican-kuttl-tests/keystone-db-create-llpsl" Oct 02 09:50:14 crc kubenswrapper[4722]: I1002 09:50:14.534589 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6r72\" (UniqueName: \"kubernetes.io/projected/4715d909-cf7b-4a50-add4-38e3482e6a5a-kube-api-access-h6r72\") pod \"keystone-db-create-llpsl\" (UID: \"4715d909-cf7b-4a50-add4-38e3482e6a5a\") " pod="barbican-kuttl-tests/keystone-db-create-llpsl" Oct 02 09:50:14 crc kubenswrapper[4722]: I1002 09:50:14.658475 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-llpsl" Oct 02 09:50:15 crc kubenswrapper[4722]: I1002 09:50:15.186724 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-llpsl"] Oct 02 09:50:15 crc kubenswrapper[4722]: I1002 09:50:15.738927 4722 generic.go:334] "Generic (PLEG): container finished" podID="4715d909-cf7b-4a50-add4-38e3482e6a5a" containerID="8e3ba57210dc8d1c17bd479fd8090305ab6c61257a5e795542328de4a413f4f8" exitCode=0 Oct 02 09:50:15 crc kubenswrapper[4722]: I1002 09:50:15.739004 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-llpsl" event={"ID":"4715d909-cf7b-4a50-add4-38e3482e6a5a","Type":"ContainerDied","Data":"8e3ba57210dc8d1c17bd479fd8090305ab6c61257a5e795542328de4a413f4f8"} Oct 02 09:50:15 crc kubenswrapper[4722]: I1002 09:50:15.739470 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-llpsl" event={"ID":"4715d909-cf7b-4a50-add4-38e3482e6a5a","Type":"ContainerStarted","Data":"fd349acfe70c0a2d48594d978a1dba7c87a660b1d3cc1e62983816d1d2937cc0"} Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.014062 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-llpsl" Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.058464 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6r72\" (UniqueName: \"kubernetes.io/projected/4715d909-cf7b-4a50-add4-38e3482e6a5a-kube-api-access-h6r72\") pod \"4715d909-cf7b-4a50-add4-38e3482e6a5a\" (UID: \"4715d909-cf7b-4a50-add4-38e3482e6a5a\") " Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.065547 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4715d909-cf7b-4a50-add4-38e3482e6a5a-kube-api-access-h6r72" (OuterVolumeSpecName: "kube-api-access-h6r72") pod "4715d909-cf7b-4a50-add4-38e3482e6a5a" (UID: "4715d909-cf7b-4a50-add4-38e3482e6a5a"). InnerVolumeSpecName "kube-api-access-h6r72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.162051 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6r72\" (UniqueName: \"kubernetes.io/projected/4715d909-cf7b-4a50-add4-38e3482e6a5a-kube-api-access-h6r72\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.199829 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.199882 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.226792 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.760301 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-llpsl" event={"ID":"4715d909-cf7b-4a50-add4-38e3482e6a5a","Type":"ContainerDied","Data":"fd349acfe70c0a2d48594d978a1dba7c87a660b1d3cc1e62983816d1d2937cc0"} Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.760359 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd349acfe70c0a2d48594d978a1dba7c87a660b1d3cc1e62983816d1d2937cc0" Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.760330 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-llpsl" Oct 02 09:50:17 crc kubenswrapper[4722]: I1002 09:50:17.786908 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.238067 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-9777-account-create-zbds6"] Oct 02 09:50:24 crc kubenswrapper[4722]: E1002 09:50:24.238660 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4715d909-cf7b-4a50-add4-38e3482e6a5a" containerName="mariadb-database-create" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.238678 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4715d909-cf7b-4a50-add4-38e3482e6a5a" containerName="mariadb-database-create" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.238793 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4715d909-cf7b-4a50-add4-38e3482e6a5a" containerName="mariadb-database-create" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.239197 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-9777-account-create-zbds6" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.241090 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-db-secret" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.246902 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-9777-account-create-zbds6"] Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.356513 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9rpc\" (UniqueName: \"kubernetes.io/projected/a314cbe4-1509-4bb3-a6ab-45ec40fa632d-kube-api-access-p9rpc\") pod \"keystone-9777-account-create-zbds6\" (UID: \"a314cbe4-1509-4bb3-a6ab-45ec40fa632d\") " pod="barbican-kuttl-tests/keystone-9777-account-create-zbds6" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.457966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9rpc\" (UniqueName: \"kubernetes.io/projected/a314cbe4-1509-4bb3-a6ab-45ec40fa632d-kube-api-access-p9rpc\") pod \"keystone-9777-account-create-zbds6\" (UID: \"a314cbe4-1509-4bb3-a6ab-45ec40fa632d\") " pod="barbican-kuttl-tests/keystone-9777-account-create-zbds6" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.474949 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9rpc\" (UniqueName: \"kubernetes.io/projected/a314cbe4-1509-4bb3-a6ab-45ec40fa632d-kube-api-access-p9rpc\") pod \"keystone-9777-account-create-zbds6\" (UID: \"a314cbe4-1509-4bb3-a6ab-45ec40fa632d\") " pod="barbican-kuttl-tests/keystone-9777-account-create-zbds6" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.558166 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-9777-account-create-zbds6" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.766670 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-9777-account-create-zbds6"] Oct 02 09:50:24 crc kubenswrapper[4722]: W1002 09:50:24.771657 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda314cbe4_1509_4bb3_a6ab_45ec40fa632d.slice/crio-6af4a0bc1d1ec3668a4fa7134f8027a9a8ff373d9df5d9ad3fd312b1788413f4 WatchSource:0}: Error finding container 6af4a0bc1d1ec3668a4fa7134f8027a9a8ff373d9df5d9ad3fd312b1788413f4: Status 404 returned error can't find the container with id 6af4a0bc1d1ec3668a4fa7134f8027a9a8ff373d9df5d9ad3fd312b1788413f4 Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.779580 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-db-secret" Oct 02 09:50:24 crc kubenswrapper[4722]: I1002 09:50:24.803549 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-9777-account-create-zbds6" event={"ID":"a314cbe4-1509-4bb3-a6ab-45ec40fa632d","Type":"ContainerStarted","Data":"6af4a0bc1d1ec3668a4fa7134f8027a9a8ff373d9df5d9ad3fd312b1788413f4"} Oct 02 09:50:25 crc kubenswrapper[4722]: I1002 09:50:25.810533 4722 generic.go:334] "Generic (PLEG): container finished" podID="a314cbe4-1509-4bb3-a6ab-45ec40fa632d" containerID="ab900f218fdfd68429e8fcf3ec42ad27d8504528ce371c6d45b3146b1cf402f2" exitCode=0 Oct 02 09:50:25 crc kubenswrapper[4722]: I1002 09:50:25.810605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-9777-account-create-zbds6" event={"ID":"a314cbe4-1509-4bb3-a6ab-45ec40fa632d","Type":"ContainerDied","Data":"ab900f218fdfd68429e8fcf3ec42ad27d8504528ce371c6d45b3146b1cf402f2"} Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.056641 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-9777-account-create-zbds6" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.097598 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9rpc\" (UniqueName: \"kubernetes.io/projected/a314cbe4-1509-4bb3-a6ab-45ec40fa632d-kube-api-access-p9rpc\") pod \"a314cbe4-1509-4bb3-a6ab-45ec40fa632d\" (UID: \"a314cbe4-1509-4bb3-a6ab-45ec40fa632d\") " Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.104127 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a314cbe4-1509-4bb3-a6ab-45ec40fa632d-kube-api-access-p9rpc" (OuterVolumeSpecName: "kube-api-access-p9rpc") pod "a314cbe4-1509-4bb3-a6ab-45ec40fa632d" (UID: "a314cbe4-1509-4bb3-a6ab-45ec40fa632d"). InnerVolumeSpecName "kube-api-access-p9rpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.199584 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9rpc\" (UniqueName: \"kubernetes.io/projected/a314cbe4-1509-4bb3-a6ab-45ec40fa632d-kube-api-access-p9rpc\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.713351 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69"] Oct 02 09:50:27 crc kubenswrapper[4722]: E1002 09:50:27.713601 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a314cbe4-1509-4bb3-a6ab-45ec40fa632d" containerName="mariadb-account-create" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.713618 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a314cbe4-1509-4bb3-a6ab-45ec40fa632d" containerName="mariadb-account-create" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.713738 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a314cbe4-1509-4bb3-a6ab-45ec40fa632d" containerName="mariadb-account-create" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.716605 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.718795 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bc9s8" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.723535 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69"] Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.809985 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-bundle\") pod \"8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.810246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-util\") pod \"8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.810291 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4stt\" (UniqueName: \"kubernetes.io/projected/52f6efda-6992-4bdd-9da0-b7cc70266f05-kube-api-access-l4stt\") pod \"8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.823785 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-9777-account-create-zbds6" event={"ID":"a314cbe4-1509-4bb3-a6ab-45ec40fa632d","Type":"ContainerDied","Data":"6af4a0bc1d1ec3668a4fa7134f8027a9a8ff373d9df5d9ad3fd312b1788413f4"} 
Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.823838 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-9777-account-create-zbds6" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.823845 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6af4a0bc1d1ec3668a4fa7134f8027a9a8ff373d9df5d9ad3fd312b1788413f4" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.911468 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-bundle\") pod \"8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.911577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-util\") pod \"8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.911616 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4stt\" (UniqueName: \"kubernetes.io/projected/52f6efda-6992-4bdd-9da0-b7cc70266f05-kube-api-access-l4stt\") pod \"8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.911966 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-bundle\") pod \"8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.912096 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-util\") pod \"8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:27 crc kubenswrapper[4722]: I1002 09:50:27.929403 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4stt\" (UniqueName: \"kubernetes.io/projected/52f6efda-6992-4bdd-9da0-b7cc70266f05-kube-api-access-l4stt\") pod \"8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:28 crc kubenswrapper[4722]: I1002 09:50:28.032622 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:28 crc kubenswrapper[4722]: I1002 09:50:28.240419 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69"] Oct 02 09:50:28 crc kubenswrapper[4722]: I1002 09:50:28.842492 4722 generic.go:334] "Generic (PLEG): container finished" podID="52f6efda-6992-4bdd-9da0-b7cc70266f05" containerID="8f261b8ba30dd4058697c6f68bf43b8e9660f4bb5f5a7a8739f8777e51225e11" exitCode=0 Oct 02 09:50:28 crc kubenswrapper[4722]: I1002 09:50:28.842886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" event={"ID":"52f6efda-6992-4bdd-9da0-b7cc70266f05","Type":"ContainerDied","Data":"8f261b8ba30dd4058697c6f68bf43b8e9660f4bb5f5a7a8739f8777e51225e11"} Oct 02 09:50:28 crc kubenswrapper[4722]: I1002 09:50:28.842930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" event={"ID":"52f6efda-6992-4bdd-9da0-b7cc70266f05","Type":"ContainerStarted","Data":"04c533a2793224953dd585d931da37659f5be1699996a82a494a4740d0ae991d"} Oct 02 09:50:28 crc kubenswrapper[4722]: I1002 09:50:28.845298 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.728870 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-dkjtj"] Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.730137 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.736303 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.736659 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-wvjvq" Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.736777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.737418 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.788782 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-dkjtj"] Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.838733 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3df4880-3d2d-429c-87b9-cad9afbbb682-config-data\") pod \"keystone-db-sync-dkjtj\" (UID: \"b3df4880-3d2d-429c-87b9-cad9afbbb682\") " pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.838889 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwx69\" (UniqueName: \"kubernetes.io/projected/b3df4880-3d2d-429c-87b9-cad9afbbb682-kube-api-access-rwx69\") pod \"keystone-db-sync-dkjtj\" (UID: \"b3df4880-3d2d-429c-87b9-cad9afbbb682\") " pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 
09:50:29.940257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwx69\" (UniqueName: \"kubernetes.io/projected/b3df4880-3d2d-429c-87b9-cad9afbbb682-kube-api-access-rwx69\") pod \"keystone-db-sync-dkjtj\" (UID: \"b3df4880-3d2d-429c-87b9-cad9afbbb682\") " pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.940342 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3df4880-3d2d-429c-87b9-cad9afbbb682-config-data\") pod \"keystone-db-sync-dkjtj\" (UID: \"b3df4880-3d2d-429c-87b9-cad9afbbb682\") " pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.947409 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3df4880-3d2d-429c-87b9-cad9afbbb682-config-data\") pod \"keystone-db-sync-dkjtj\" (UID: \"b3df4880-3d2d-429c-87b9-cad9afbbb682\") " pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" Oct 02 09:50:29 crc kubenswrapper[4722]: I1002 09:50:29.959486 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwx69\" (UniqueName: \"kubernetes.io/projected/b3df4880-3d2d-429c-87b9-cad9afbbb682-kube-api-access-rwx69\") pod \"keystone-db-sync-dkjtj\" (UID: \"b3df4880-3d2d-429c-87b9-cad9afbbb682\") " pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" Oct 02 09:50:30 crc kubenswrapper[4722]: I1002 09:50:30.047912 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" Oct 02 09:50:30 crc kubenswrapper[4722]: I1002 09:50:30.438067 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-dkjtj"] Oct 02 09:50:30 crc kubenswrapper[4722]: W1002 09:50:30.444056 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3df4880_3d2d_429c_87b9_cad9afbbb682.slice/crio-96e2f2225e5606a06545914d2d93e6bf8ce9f057d2a1d8347d7691ebd44c6d4e WatchSource:0}: Error finding container 96e2f2225e5606a06545914d2d93e6bf8ce9f057d2a1d8347d7691ebd44c6d4e: Status 404 returned error can't find the container with id 96e2f2225e5606a06545914d2d93e6bf8ce9f057d2a1d8347d7691ebd44c6d4e Oct 02 09:50:30 crc kubenswrapper[4722]: I1002 09:50:30.856136 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" event={"ID":"b3df4880-3d2d-429c-87b9-cad9afbbb682","Type":"ContainerStarted","Data":"96e2f2225e5606a06545914d2d93e6bf8ce9f057d2a1d8347d7691ebd44c6d4e"} Oct 02 09:50:30 crc kubenswrapper[4722]: I1002 09:50:30.858362 4722 generic.go:334] "Generic (PLEG): container finished" podID="52f6efda-6992-4bdd-9da0-b7cc70266f05" containerID="2bfbc801e2e046d44aa0cb629d3510aa5e15ce16c4aab501d753cc685f1eafa3" exitCode=0 Oct 02 09:50:30 crc kubenswrapper[4722]: I1002 09:50:30.858412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" event={"ID":"52f6efda-6992-4bdd-9da0-b7cc70266f05","Type":"ContainerDied","Data":"2bfbc801e2e046d44aa0cb629d3510aa5e15ce16c4aab501d753cc685f1eafa3"} Oct 02 09:50:31 crc kubenswrapper[4722]: I1002 09:50:31.869699 4722 generic.go:334] "Generic (PLEG): container finished" podID="52f6efda-6992-4bdd-9da0-b7cc70266f05" 
containerID="0303adf9b7d93a00abc9e84ff4975126e8bd2d6793f3cc4a85d2782cf8a6ab6e" exitCode=0 Oct 02 09:50:31 crc kubenswrapper[4722]: I1002 09:50:31.869754 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" event={"ID":"52f6efda-6992-4bdd-9da0-b7cc70266f05","Type":"ContainerDied","Data":"0303adf9b7d93a00abc9e84ff4975126e8bd2d6793f3cc4a85d2782cf8a6ab6e"} Oct 02 09:50:37 crc kubenswrapper[4722]: I1002 09:50:37.987133 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.075049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-util\") pod \"52f6efda-6992-4bdd-9da0-b7cc70266f05\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.075151 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-bundle\") pod \"52f6efda-6992-4bdd-9da0-b7cc70266f05\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.075174 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4stt\" (UniqueName: \"kubernetes.io/projected/52f6efda-6992-4bdd-9da0-b7cc70266f05-kube-api-access-l4stt\") pod \"52f6efda-6992-4bdd-9da0-b7cc70266f05\" (UID: \"52f6efda-6992-4bdd-9da0-b7cc70266f05\") " Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.076883 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-bundle" (OuterVolumeSpecName: "bundle") pod "52f6efda-6992-4bdd-9da0-b7cc70266f05" (UID: "52f6efda-6992-4bdd-9da0-b7cc70266f05"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.079335 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f6efda-6992-4bdd-9da0-b7cc70266f05-kube-api-access-l4stt" (OuterVolumeSpecName: "kube-api-access-l4stt") pod "52f6efda-6992-4bdd-9da0-b7cc70266f05" (UID: "52f6efda-6992-4bdd-9da0-b7cc70266f05"). InnerVolumeSpecName "kube-api-access-l4stt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.092700 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-util" (OuterVolumeSpecName: "util") pod "52f6efda-6992-4bdd-9da0-b7cc70266f05" (UID: "52f6efda-6992-4bdd-9da0-b7cc70266f05"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.176552 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-util\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.176603 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52f6efda-6992-4bdd-9da0-b7cc70266f05-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.176617 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4stt\" (UniqueName: \"kubernetes.io/projected/52f6efda-6992-4bdd-9da0-b7cc70266f05-kube-api-access-l4stt\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.964367 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" event={"ID":"b3df4880-3d2d-429c-87b9-cad9afbbb682","Type":"ContainerStarted","Data":"2a1cea57a939aa89a46fa62cd3100e07fd1ea4d89511fcf4dd690d2ea2b2cdd7"} Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.967035 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" event={"ID":"52f6efda-6992-4bdd-9da0-b7cc70266f05","Type":"ContainerDied","Data":"04c533a2793224953dd585d931da37659f5be1699996a82a494a4740d0ae991d"} Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.967075 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c533a2793224953dd585d931da37659f5be1699996a82a494a4740d0ae991d" Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.967142 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69" Oct 02 09:50:38 crc kubenswrapper[4722]: I1002 09:50:38.979983 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" podStartSLOduration=2.401195407 podStartE2EDuration="9.979963653s" podCreationTimestamp="2025-10-02 09:50:29 +0000 UTC" firstStartedPulling="2025-10-02 09:50:30.446731258 +0000 UTC m=+1086.483484238" lastFinishedPulling="2025-10-02 09:50:38.025499504 +0000 UTC m=+1094.062252484" observedRunningTime="2025-10-02 09:50:38.978947828 +0000 UTC m=+1095.015700838" watchObservedRunningTime="2025-10-02 09:50:38.979963653 +0000 UTC m=+1095.016716643" Oct 02 09:50:45 crc kubenswrapper[4722]: I1002 09:50:45.006556 4722 generic.go:334] "Generic (PLEG): container finished" podID="b3df4880-3d2d-429c-87b9-cad9afbbb682" containerID="2a1cea57a939aa89a46fa62cd3100e07fd1ea4d89511fcf4dd690d2ea2b2cdd7" exitCode=0 Oct 02 09:50:45 crc kubenswrapper[4722]: I1002 09:50:45.006653 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" event={"ID":"b3df4880-3d2d-429c-87b9-cad9afbbb682","Type":"ContainerDied","Data":"2a1cea57a939aa89a46fa62cd3100e07fd1ea4d89511fcf4dd690d2ea2b2cdd7"} Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.272464 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.301001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3df4880-3d2d-429c-87b9-cad9afbbb682-config-data\") pod \"b3df4880-3d2d-429c-87b9-cad9afbbb682\" (UID: \"b3df4880-3d2d-429c-87b9-cad9afbbb682\") " Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.301183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwx69\" (UniqueName: \"kubernetes.io/projected/b3df4880-3d2d-429c-87b9-cad9afbbb682-kube-api-access-rwx69\") pod \"b3df4880-3d2d-429c-87b9-cad9afbbb682\" (UID: \"b3df4880-3d2d-429c-87b9-cad9afbbb682\") " Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.305919 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3df4880-3d2d-429c-87b9-cad9afbbb682-kube-api-access-rwx69" (OuterVolumeSpecName: "kube-api-access-rwx69") pod "b3df4880-3d2d-429c-87b9-cad9afbbb682" (UID: "b3df4880-3d2d-429c-87b9-cad9afbbb682"). InnerVolumeSpecName "kube-api-access-rwx69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.333601 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3df4880-3d2d-429c-87b9-cad9afbbb682-config-data" (OuterVolumeSpecName: "config-data") pod "b3df4880-3d2d-429c-87b9-cad9afbbb682" (UID: "b3df4880-3d2d-429c-87b9-cad9afbbb682"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.402420 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwx69\" (UniqueName: \"kubernetes.io/projected/b3df4880-3d2d-429c-87b9-cad9afbbb682-kube-api-access-rwx69\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.402467 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3df4880-3d2d-429c-87b9-cad9afbbb682-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.959181 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw"] Oct 02 09:50:46 crc kubenswrapper[4722]: E1002 09:50:46.959935 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f6efda-6992-4bdd-9da0-b7cc70266f05" containerName="util" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.959952 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f6efda-6992-4bdd-9da0-b7cc70266f05" containerName="util" Oct 02 09:50:46 crc kubenswrapper[4722]: E1002 09:50:46.959968 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f6efda-6992-4bdd-9da0-b7cc70266f05" containerName="extract" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.959975 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f6efda-6992-4bdd-9da0-b7cc70266f05" containerName="extract" Oct 02 09:50:46 crc kubenswrapper[4722]: E1002 09:50:46.959987 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3df4880-3d2d-429c-87b9-cad9afbbb682" containerName="keystone-db-sync" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.959995 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3df4880-3d2d-429c-87b9-cad9afbbb682" 
containerName="keystone-db-sync" Oct 02 09:50:46 crc kubenswrapper[4722]: E1002 09:50:46.960012 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f6efda-6992-4bdd-9da0-b7cc70266f05" containerName="pull" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.960020 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f6efda-6992-4bdd-9da0-b7cc70266f05" containerName="pull" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.960154 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3df4880-3d2d-429c-87b9-cad9afbbb682" containerName="keystone-db-sync" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.960181 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f6efda-6992-4bdd-9da0-b7cc70266f05" containerName="extract" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.961009 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.975972 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7kbkd" Oct 02 09:50:46 crc kubenswrapper[4722]: I1002 09:50:46.976535 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-service-cert" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.012389 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-webhook-cert\") pod \"barbican-operator-controller-manager-7dbf94768f-lxtzw\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.012469 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z7jn\" (UniqueName: \"kubernetes.io/projected/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-kube-api-access-8z7jn\") pod \"barbican-operator-controller-manager-7dbf94768f-lxtzw\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.012518 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-apiservice-cert\") pod \"barbican-operator-controller-manager-7dbf94768f-lxtzw\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.019561 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" event={"ID":"b3df4880-3d2d-429c-87b9-cad9afbbb682","Type":"ContainerDied","Data":"96e2f2225e5606a06545914d2d93e6bf8ce9f057d2a1d8347d7691ebd44c6d4e"} Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.019600 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96e2f2225e5606a06545914d2d93e6bf8ce9f057d2a1d8347d7691ebd44c6d4e" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.019650 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-dkjtj" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.047881 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw"] Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.113294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-apiservice-cert\") pod \"barbican-operator-controller-manager-7dbf94768f-lxtzw\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.113407 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-webhook-cert\") pod \"barbican-operator-controller-manager-7dbf94768f-lxtzw\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.113447 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z7jn\" (UniqueName: \"kubernetes.io/projected/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-kube-api-access-8z7jn\") pod \"barbican-operator-controller-manager-7dbf94768f-lxtzw\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.119129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-apiservice-cert\") pod \"barbican-operator-controller-manager-7dbf94768f-lxtzw\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.119566 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-webhook-cert\") pod \"barbican-operator-controller-manager-7dbf94768f-lxtzw\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.134955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z7jn\" (UniqueName: \"kubernetes.io/projected/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-kube-api-access-8z7jn\") pod \"barbican-operator-controller-manager-7dbf94768f-lxtzw\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.230891 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-chdcv"] Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.231740 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.233700 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.234015 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-wvjvq" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.234962 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.235129 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.277575 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-chdcv"] Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.277902 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.316124 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-config-data\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.316172 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-fernet-keys\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.316198 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-credential-keys\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.316277 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-scripts\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.316305 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlx8s\" (UniqueName: \"kubernetes.io/projected/c6107a52-6863-4d02-a3ec-dc6161bfac26-kube-api-access-nlx8s\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.418070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-scripts\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " 
pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.418477 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlx8s\" (UniqueName: \"kubernetes.io/projected/c6107a52-6863-4d02-a3ec-dc6161bfac26-kube-api-access-nlx8s\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.418529 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-config-data\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.418548 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-fernet-keys\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.418571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-credential-keys\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.424173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-scripts\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.424218 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-fernet-keys\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.437772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-credential-keys\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.441058 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlx8s\" (UniqueName: \"kubernetes.io/projected/c6107a52-6863-4d02-a3ec-dc6161bfac26-kube-api-access-nlx8s\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.442369 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-config-data\") pod \"keystone-bootstrap-chdcv\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: 
I1002 09:50:47.529255 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw"] Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.562097 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:47 crc kubenswrapper[4722]: I1002 09:50:47.968075 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-chdcv"] Oct 02 09:50:47 crc kubenswrapper[4722]: W1002 09:50:47.974264 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6107a52_6863_4d02_a3ec_dc6161bfac26.slice/crio-818a6bd704f82fc371a363c87b3589b1b0f2eabbfdb487f1b43e2bb5d7fd2adf WatchSource:0}: Error finding container 818a6bd704f82fc371a363c87b3589b1b0f2eabbfdb487f1b43e2bb5d7fd2adf: Status 404 returned error can't find the container with id 818a6bd704f82fc371a363c87b3589b1b0f2eabbfdb487f1b43e2bb5d7fd2adf Oct 02 09:50:48 crc kubenswrapper[4722]: I1002 09:50:48.026824 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" event={"ID":"ff0e8efc-3b56-4f5e-9944-9e34a644c05e","Type":"ContainerStarted","Data":"00b53174e2fde26756a7ab16b1dba5af50a7710ba53394dfd075060f2a83a662"} Oct 02 09:50:48 crc kubenswrapper[4722]: I1002 09:50:48.028036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" event={"ID":"c6107a52-6863-4d02-a3ec-dc6161bfac26","Type":"ContainerStarted","Data":"818a6bd704f82fc371a363c87b3589b1b0f2eabbfdb487f1b43e2bb5d7fd2adf"} Oct 02 09:50:49 crc kubenswrapper[4722]: I1002 09:50:49.036361 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" event={"ID":"c6107a52-6863-4d02-a3ec-dc6161bfac26","Type":"ContainerStarted","Data":"2074b690c7bf32aef78eb9794e0be9a0c85d7a196b3f60274bdc74d2d054361b"} Oct 02 09:50:49 crc kubenswrapper[4722]: I1002 09:50:49.050222 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" podStartSLOduration=2.050204515 podStartE2EDuration="2.050204515s" podCreationTimestamp="2025-10-02 09:50:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:50:49.049764924 +0000 UTC m=+1105.086517924" watchObservedRunningTime="2025-10-02 09:50:49.050204515 +0000 UTC m=+1105.086957495" Oct 02 09:50:50 crc kubenswrapper[4722]: I1002 09:50:50.049829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" event={"ID":"ff0e8efc-3b56-4f5e-9944-9e34a644c05e","Type":"ContainerStarted","Data":"dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef"} Oct 02 09:50:50 crc kubenswrapper[4722]: I1002 09:50:50.050217 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" event={"ID":"ff0e8efc-3b56-4f5e-9944-9e34a644c05e","Type":"ContainerStarted","Data":"32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9"} Oct 02 09:50:50 crc kubenswrapper[4722]: I1002 09:50:50.050265 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 
09:50:51 crc kubenswrapper[4722]: I1002 09:50:51.056329 4722 generic.go:334] "Generic (PLEG): container finished" podID="c6107a52-6863-4d02-a3ec-dc6161bfac26" containerID="2074b690c7bf32aef78eb9794e0be9a0c85d7a196b3f60274bdc74d2d054361b" exitCode=0 Oct 02 09:50:51 crc kubenswrapper[4722]: I1002 09:50:51.056413 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" event={"ID":"c6107a52-6863-4d02-a3ec-dc6161bfac26","Type":"ContainerDied","Data":"2074b690c7bf32aef78eb9794e0be9a0c85d7a196b3f60274bdc74d2d054361b"} Oct 02 09:50:51 crc kubenswrapper[4722]: I1002 09:50:51.071768 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" podStartSLOduration=3.086711383 podStartE2EDuration="5.071750201s" podCreationTimestamp="2025-10-02 09:50:46 +0000 UTC" firstStartedPulling="2025-10-02 09:50:47.531432309 +0000 UTC m=+1103.568185289" lastFinishedPulling="2025-10-02 09:50:49.516471127 +0000 UTC m=+1105.553224107" observedRunningTime="2025-10-02 09:50:50.072770686 +0000 UTC m=+1106.109523676" watchObservedRunningTime="2025-10-02 09:50:51.071750201 +0000 UTC m=+1107.108503181" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.339723 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.416569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-fernet-keys\") pod \"c6107a52-6863-4d02-a3ec-dc6161bfac26\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.416753 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-credential-keys\") pod \"c6107a52-6863-4d02-a3ec-dc6161bfac26\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.416896 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlx8s\" (UniqueName: \"kubernetes.io/projected/c6107a52-6863-4d02-a3ec-dc6161bfac26-kube-api-access-nlx8s\") pod \"c6107a52-6863-4d02-a3ec-dc6161bfac26\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.416922 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-config-data\") pod \"c6107a52-6863-4d02-a3ec-dc6161bfac26\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.416984 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-scripts\") pod \"c6107a52-6863-4d02-a3ec-dc6161bfac26\" (UID: \"c6107a52-6863-4d02-a3ec-dc6161bfac26\") " Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.423383 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-scripts" (OuterVolumeSpecName: "scripts") pod "c6107a52-6863-4d02-a3ec-dc6161bfac26" (UID: "c6107a52-6863-4d02-a3ec-dc6161bfac26"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.423866 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c6107a52-6863-4d02-a3ec-dc6161bfac26" (UID: "c6107a52-6863-4d02-a3ec-dc6161bfac26"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.424031 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6107a52-6863-4d02-a3ec-dc6161bfac26-kube-api-access-nlx8s" (OuterVolumeSpecName: "kube-api-access-nlx8s") pod "c6107a52-6863-4d02-a3ec-dc6161bfac26" (UID: "c6107a52-6863-4d02-a3ec-dc6161bfac26"). InnerVolumeSpecName "kube-api-access-nlx8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.424581 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c6107a52-6863-4d02-a3ec-dc6161bfac26" (UID: "c6107a52-6863-4d02-a3ec-dc6161bfac26"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.437970 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-config-data" (OuterVolumeSpecName: "config-data") pod "c6107a52-6863-4d02-a3ec-dc6161bfac26" (UID: "c6107a52-6863-4d02-a3ec-dc6161bfac26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.518974 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlx8s\" (UniqueName: \"kubernetes.io/projected/c6107a52-6863-4d02-a3ec-dc6161bfac26-kube-api-access-nlx8s\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.519042 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.519056 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.519068 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:52 crc kubenswrapper[4722]: I1002 09:50:52.519082 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6107a52-6863-4d02-a3ec-dc6161bfac26-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.071445 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" event={"ID":"c6107a52-6863-4d02-a3ec-dc6161bfac26","Type":"ContainerDied","Data":"818a6bd704f82fc371a363c87b3589b1b0f2eabbfdb487f1b43e2bb5d7fd2adf"} Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.071482 4722 pod_container_deletor.go:80] "Container 
Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.071488 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-chdcv" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.140252 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-54b885497-t8c9k"] Oct 02 09:50:53 crc kubenswrapper[4722]: E1002 09:50:53.140581 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6107a52-6863-4d02-a3ec-dc6161bfac26" containerName="keystone-bootstrap" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.140603 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6107a52-6863-4d02-a3ec-dc6161bfac26" containerName="keystone-bootstrap" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.140788 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6107a52-6863-4d02-a3ec-dc6161bfac26" containerName="keystone-bootstrap" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.141359 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.143539 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.143813 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-wvjvq" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.143988 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.148188 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.151914 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-54b885497-t8c9k"] Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.227716 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-credential-keys\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.227778 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-config-data\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.227944 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-fernet-keys\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.228042 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-scripts\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.228094 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2btq\" (UniqueName: \"kubernetes.io/projected/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-kube-api-access-z2btq\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.329136 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-config-data\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.329195 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-fernet-keys\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.329238 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-scripts\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.329287 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2btq\" (UniqueName: \"kubernetes.io/projected/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-kube-api-access-z2btq\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.329342 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-credential-keys\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.335428 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-credential-keys\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.336498 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-fernet-keys\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.341872 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-scripts\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.348857 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2btq\" (UniqueName: \"kubernetes.io/projected/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-kube-api-access-z2btq\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.350199 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-config-data\") pod \"keystone-54b885497-t8c9k\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.457575 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:53 crc kubenswrapper[4722]: I1002 09:50:53.848115 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-54b885497-t8c9k"] Oct 02 09:50:54 crc kubenswrapper[4722]: I1002 09:50:54.085870 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" event={"ID":"8198b0a7-cf5a-41f0-a528-0f3833ff97a6","Type":"ContainerStarted","Data":"e329ca8245e6f8ad5d6e82c79296035a0ba51fcaabed81266e371466d8d27b9f"} Oct 02 09:50:54 crc kubenswrapper[4722]: I1002 09:50:54.086295 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" event={"ID":"8198b0a7-cf5a-41f0-a528-0f3833ff97a6","Type":"ContainerStarted","Data":"56dd6ce64b50e82aa885784f29ea21a983bef1b28e5a012629c0774e58238368"} Oct 02 09:50:54 crc kubenswrapper[4722]: I1002 09:50:54.086533 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:50:54 crc kubenswrapper[4722]: I1002 09:50:54.103772 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" podStartSLOduration=1.103752815 podStartE2EDuration="1.103752815s" podCreationTimestamp="2025-10-02 09:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:50:54.100415992 +0000 UTC m=+1110.137169002" watchObservedRunningTime="2025-10-02 09:50:54.103752815 +0000 UTC m=+1110.140505795" Oct 02 09:50:57 crc kubenswrapper[4722]: I1002 09:50:57.281446 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:50:59 crc kubenswrapper[4722]: I1002 09:50:59.218751 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-create-dn5np"] Oct 02 09:50:59 crc kubenswrapper[4722]: I1002 09:50:59.221260 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-dn5np" Oct 02 09:50:59 crc kubenswrapper[4722]: I1002 09:50:59.228746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-dn5np"] Oct 02 09:50:59 crc kubenswrapper[4722]: I1002 09:50:59.317038 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhw24\" (UniqueName: \"kubernetes.io/projected/d4e64138-2948-4ba2-9cab-d87233718abe-kube-api-access-zhw24\") pod \"barbican-db-create-dn5np\" (UID: \"d4e64138-2948-4ba2-9cab-d87233718abe\") " pod="barbican-kuttl-tests/barbican-db-create-dn5np" Oct 02 09:50:59 crc kubenswrapper[4722]: I1002 09:50:59.418013 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhw24\" (UniqueName: \"kubernetes.io/projected/d4e64138-2948-4ba2-9cab-d87233718abe-kube-api-access-zhw24\") pod \"barbican-db-create-dn5np\" (UID: \"d4e64138-2948-4ba2-9cab-d87233718abe\") " pod="barbican-kuttl-tests/barbican-db-create-dn5np" Oct 02 09:50:59 crc kubenswrapper[4722]: I1002 09:50:59.435492 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhw24\" (UniqueName: \"kubernetes.io/projected/d4e64138-2948-4ba2-9cab-d87233718abe-kube-api-access-zhw24\") pod \"barbican-db-create-dn5np\" (UID: \"d4e64138-2948-4ba2-9cab-d87233718abe\") " pod="barbican-kuttl-tests/barbican-db-create-dn5np" Oct 02 09:50:59 crc kubenswrapper[4722]: I1002 09:50:59.543059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-dn5np" Oct 02 09:50:59 crc kubenswrapper[4722]: I1002 09:50:59.981899 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-dn5np"] Oct 02 09:51:00 crc kubenswrapper[4722]: I1002 09:51:00.126374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-dn5np" event={"ID":"d4e64138-2948-4ba2-9cab-d87233718abe","Type":"ContainerStarted","Data":"4919dba784eee591652531b7ecf516b36aa56956948a5ca790b4036c1badfc5b"} Oct 02 09:51:01 crc kubenswrapper[4722]: I1002 09:51:01.136750 4722 generic.go:334] "Generic (PLEG): container finished" podID="d4e64138-2948-4ba2-9cab-d87233718abe" containerID="22f402bd2bf520867d8f8c91b2db03d128c3da48c70db6574dc05106c174a419" exitCode=0 Oct 02 09:51:01 crc kubenswrapper[4722]: I1002 09:51:01.136822 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-dn5np" event={"ID":"d4e64138-2948-4ba2-9cab-d87233718abe","Type":"ContainerDied","Data":"22f402bd2bf520867d8f8c91b2db03d128c3da48c70db6574dc05106c174a419"} Oct 02 09:51:02 crc kubenswrapper[4722]: I1002 09:51:02.372184 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-dn5np" Oct 02 09:51:02 crc kubenswrapper[4722]: I1002 09:51:02.562067 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhw24\" (UniqueName: \"kubernetes.io/projected/d4e64138-2948-4ba2-9cab-d87233718abe-kube-api-access-zhw24\") pod \"d4e64138-2948-4ba2-9cab-d87233718abe\" (UID: \"d4e64138-2948-4ba2-9cab-d87233718abe\") " Oct 02 09:51:02 crc kubenswrapper[4722]: I1002 09:51:02.568120 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e64138-2948-4ba2-9cab-d87233718abe-kube-api-access-zhw24" (OuterVolumeSpecName: "kube-api-access-zhw24") pod "d4e64138-2948-4ba2-9cab-d87233718abe" (UID: "d4e64138-2948-4ba2-9cab-d87233718abe"). InnerVolumeSpecName "kube-api-access-zhw24". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:02 crc kubenswrapper[4722]: I1002 09:51:02.666443 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhw24\" (UniqueName: \"kubernetes.io/projected/d4e64138-2948-4ba2-9cab-d87233718abe-kube-api-access-zhw24\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:03 crc kubenswrapper[4722]: I1002 09:51:03.151014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-dn5np" event={"ID":"d4e64138-2948-4ba2-9cab-d87233718abe","Type":"ContainerDied","Data":"4919dba784eee591652531b7ecf516b36aa56956948a5ca790b4036c1badfc5b"} Oct 02 09:51:03 crc kubenswrapper[4722]: I1002 09:51:03.151473 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4919dba784eee591652531b7ecf516b36aa56956948a5ca790b4036c1badfc5b" Oct 02 09:51:03 crc kubenswrapper[4722]: I1002 09:51:03.151077 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-dn5np" Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.224702 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-153a-account-create-bxl99"] Oct 02 09:51:09 crc kubenswrapper[4722]: E1002 09:51:09.226112 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e64138-2948-4ba2-9cab-d87233718abe" containerName="mariadb-database-create" Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.226132 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e64138-2948-4ba2-9cab-d87233718abe" containerName="mariadb-database-create" Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.226296 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e64138-2948-4ba2-9cab-d87233718abe" containerName="mariadb-database-create" Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.227080 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-153a-account-create-bxl99" Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.230145 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret" Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.244369 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-153a-account-create-bxl99"] Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.357469 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br5jb\" (UniqueName: \"kubernetes.io/projected/fe829993-3dea-4c1c-826f-3d8cea2363e7-kube-api-access-br5jb\") pod \"barbican-153a-account-create-bxl99\" (UID: \"fe829993-3dea-4c1c-826f-3d8cea2363e7\") " pod="barbican-kuttl-tests/barbican-153a-account-create-bxl99" Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.458711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br5jb\" (UniqueName: \"kubernetes.io/projected/fe829993-3dea-4c1c-826f-3d8cea2363e7-kube-api-access-br5jb\") pod \"barbican-153a-account-create-bxl99\" (UID: \"fe829993-3dea-4c1c-826f-3d8cea2363e7\") " pod="barbican-kuttl-tests/barbican-153a-account-create-bxl99" Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.477470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br5jb\" (UniqueName: \"kubernetes.io/projected/fe829993-3dea-4c1c-826f-3d8cea2363e7-kube-api-access-br5jb\") pod \"barbican-153a-account-create-bxl99\" (UID: \"fe829993-3dea-4c1c-826f-3d8cea2363e7\") " pod="barbican-kuttl-tests/barbican-153a-account-create-bxl99" Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.554703 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-153a-account-create-bxl99" Oct 02 09:51:09 crc kubenswrapper[4722]: I1002 09:51:09.777876 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-153a-account-create-bxl99"] Oct 02 09:51:09 crc kubenswrapper[4722]: W1002 09:51:09.784772 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe829993_3dea_4c1c_826f_3d8cea2363e7.slice/crio-e8375a2939a091dbab985510c566b181b65843d6a5709183c4124c6567c4edae WatchSource:0}: Error finding container e8375a2939a091dbab985510c566b181b65843d6a5709183c4124c6567c4edae: Status 404 returned error can't find the container with id e8375a2939a091dbab985510c566b181b65843d6a5709183c4124c6567c4edae Oct 02 09:51:10 crc kubenswrapper[4722]: I1002 09:51:10.195691 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe829993-3dea-4c1c-826f-3d8cea2363e7" containerID="27a9f8b1e9825b0301e16a4e556b1b9fca0c6d678b5b0f4d2492bcfc055714d4" exitCode=0 Oct 02 09:51:10 crc kubenswrapper[4722]: I1002 09:51:10.195845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-153a-account-create-bxl99" event={"ID":"fe829993-3dea-4c1c-826f-3d8cea2363e7","Type":"ContainerDied","Data":"27a9f8b1e9825b0301e16a4e556b1b9fca0c6d678b5b0f4d2492bcfc055714d4"} Oct 02 09:51:10 crc kubenswrapper[4722]: I1002 09:51:10.196055 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-153a-account-create-bxl99" event={"ID":"fe829993-3dea-4c1c-826f-3d8cea2363e7","Type":"ContainerStarted","Data":"e8375a2939a091dbab985510c566b181b65843d6a5709183c4124c6567c4edae"} Oct 02 09:51:11 crc kubenswrapper[4722]: I1002 09:51:11.472214 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-153a-account-create-bxl99" Oct 02 09:51:11 crc kubenswrapper[4722]: I1002 09:51:11.598799 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br5jb\" (UniqueName: \"kubernetes.io/projected/fe829993-3dea-4c1c-826f-3d8cea2363e7-kube-api-access-br5jb\") pod \"fe829993-3dea-4c1c-826f-3d8cea2363e7\" (UID: \"fe829993-3dea-4c1c-826f-3d8cea2363e7\") " Oct 02 09:51:11 crc kubenswrapper[4722]: I1002 09:51:11.609047 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe829993-3dea-4c1c-826f-3d8cea2363e7-kube-api-access-br5jb" (OuterVolumeSpecName: "kube-api-access-br5jb") pod "fe829993-3dea-4c1c-826f-3d8cea2363e7" (UID: "fe829993-3dea-4c1c-826f-3d8cea2363e7"). InnerVolumeSpecName "kube-api-access-br5jb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:11 crc kubenswrapper[4722]: I1002 09:51:11.700847 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br5jb\" (UniqueName: \"kubernetes.io/projected/fe829993-3dea-4c1c-826f-3d8cea2363e7-kube-api-access-br5jb\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:12 crc kubenswrapper[4722]: I1002 09:51:12.210741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-153a-account-create-bxl99" event={"ID":"fe829993-3dea-4c1c-826f-3d8cea2363e7","Type":"ContainerDied","Data":"e8375a2939a091dbab985510c566b181b65843d6a5709183c4124c6567c4edae"} Oct 02 09:51:12 crc kubenswrapper[4722]: I1002 09:51:12.210799 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8375a2939a091dbab985510c566b181b65843d6a5709183c4124c6567c4edae" Oct 02 09:51:12 crc kubenswrapper[4722]: I1002 09:51:12.210921 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-153a-account-create-bxl99" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.509747 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-zh7g9"] Oct 02 09:51:14 crc kubenswrapper[4722]: E1002 09:51:14.510889 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe829993-3dea-4c1c-826f-3d8cea2363e7" containerName="mariadb-account-create" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.510907 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe829993-3dea-4c1c-826f-3d8cea2363e7" containerName="mariadb-account-create" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.511127 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe829993-3dea-4c1c-826f-3d8cea2363e7" containerName="mariadb-account-create" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.511889 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.513870 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-cs9cf" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.515031 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.517609 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-zh7g9"] Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.649801 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lhsc\" (UniqueName: \"kubernetes.io/projected/f31283d3-960a-4d9e-809f-6c98df068e39-kube-api-access-2lhsc\") pod \"barbican-db-sync-zh7g9\" (UID: \"f31283d3-960a-4d9e-809f-6c98df068e39\") " pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.650205 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f31283d3-960a-4d9e-809f-6c98df068e39-db-sync-config-data\") pod \"barbican-db-sync-zh7g9\" (UID: \"f31283d3-960a-4d9e-809f-6c98df068e39\") " pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.751581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f31283d3-960a-4d9e-809f-6c98df068e39-db-sync-config-data\") pod \"barbican-db-sync-zh7g9\" (UID: \"f31283d3-960a-4d9e-809f-6c98df068e39\") " pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.751789 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lhsc\" (UniqueName: \"kubernetes.io/projected/f31283d3-960a-4d9e-809f-6c98df068e39-kube-api-access-2lhsc\") pod \"barbican-db-sync-zh7g9\" (UID: \"f31283d3-960a-4d9e-809f-6c98df068e39\") " pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.763574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f31283d3-960a-4d9e-809f-6c98df068e39-db-sync-config-data\") pod \"barbican-db-sync-zh7g9\" (UID: \"f31283d3-960a-4d9e-809f-6c98df068e39\") " pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.779577 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lhsc\" (UniqueName: \"kubernetes.io/projected/f31283d3-960a-4d9e-809f-6c98df068e39-kube-api-access-2lhsc\") pod \"barbican-db-sync-zh7g9\" (UID: \"f31283d3-960a-4d9e-809f-6c98df068e39\") " pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" Oct 02 09:51:14 crc kubenswrapper[4722]: I1002 09:51:14.827521 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" Oct 02 09:51:15 crc kubenswrapper[4722]: I1002 09:51:15.272706 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-zh7g9"] Oct 02 09:51:16 crc kubenswrapper[4722]: I1002 09:51:16.242191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" event={"ID":"f31283d3-960a-4d9e-809f-6c98df068e39","Type":"ContainerStarted","Data":"0379e56ac0a3fd46747840fb46c2f9e3beb28b912ddfa227c2c91dc43e251348"} Oct 02 09:51:20 crc kubenswrapper[4722]: I1002 09:51:20.271829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" event={"ID":"f31283d3-960a-4d9e-809f-6c98df068e39","Type":"ContainerStarted","Data":"e2750b9bfe253288155d81b2d5c0baf1568698fa6c79146ffbbbb7766e0abc8e"} Oct 02 09:51:20 crc kubenswrapper[4722]: I1002 09:51:20.288683 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" podStartSLOduration=2.156285867 podStartE2EDuration="6.288652814s" podCreationTimestamp="2025-10-02 09:51:14 +0000 UTC" firstStartedPulling="2025-10-02 09:51:15.282423441 +0000 UTC m=+1131.319176421" lastFinishedPulling="2025-10-02 09:51:19.414790388 +0000 UTC m=+1135.451543368" observedRunningTime="2025-10-02 09:51:20.286765367 +0000 UTC m=+1136.323518357" watchObservedRunningTime="2025-10-02 09:51:20.288652814 +0000 UTC m=+1136.325405794" Oct 02 09:51:23 crc kubenswrapper[4722]: I1002 09:51:23.294640 4722 generic.go:334] "Generic (PLEG): container finished" podID="f31283d3-960a-4d9e-809f-6c98df068e39" containerID="e2750b9bfe253288155d81b2d5c0baf1568698fa6c79146ffbbbb7766e0abc8e" exitCode=0 Oct 02 09:51:23 crc kubenswrapper[4722]: I1002 09:51:23.294840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" event={"ID":"f31283d3-960a-4d9e-809f-6c98df068e39","Type":"ContainerDied","Data":"e2750b9bfe253288155d81b2d5c0baf1568698fa6c79146ffbbbb7766e0abc8e"} Oct 02 09:51:24 crc kubenswrapper[4722]: I1002 09:51:24.551065 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" Oct 02 09:51:24 crc kubenswrapper[4722]: I1002 09:51:24.699572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lhsc\" (UniqueName: \"kubernetes.io/projected/f31283d3-960a-4d9e-809f-6c98df068e39-kube-api-access-2lhsc\") pod \"f31283d3-960a-4d9e-809f-6c98df068e39\" (UID: \"f31283d3-960a-4d9e-809f-6c98df068e39\") " Oct 02 09:51:24 crc kubenswrapper[4722]: I1002 09:51:24.700013 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f31283d3-960a-4d9e-809f-6c98df068e39-db-sync-config-data\") pod \"f31283d3-960a-4d9e-809f-6c98df068e39\" (UID: \"f31283d3-960a-4d9e-809f-6c98df068e39\") " Oct 02 09:51:24 crc kubenswrapper[4722]: I1002 09:51:24.705442 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31283d3-960a-4d9e-809f-6c98df068e39-kube-api-access-2lhsc" (OuterVolumeSpecName: "kube-api-access-2lhsc") pod "f31283d3-960a-4d9e-809f-6c98df068e39" (UID: "f31283d3-960a-4d9e-809f-6c98df068e39"). InnerVolumeSpecName "kube-api-access-2lhsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:24 crc kubenswrapper[4722]: I1002 09:51:24.705572 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31283d3-960a-4d9e-809f-6c98df068e39-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f31283d3-960a-4d9e-809f-6c98df068e39" (UID: "f31283d3-960a-4d9e-809f-6c98df068e39"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:24 crc kubenswrapper[4722]: I1002 09:51:24.803386 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lhsc\" (UniqueName: \"kubernetes.io/projected/f31283d3-960a-4d9e-809f-6c98df068e39-kube-api-access-2lhsc\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:24 crc kubenswrapper[4722]: I1002 09:51:24.803426 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f31283d3-960a-4d9e-809f-6c98df068e39-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:24 crc kubenswrapper[4722]: I1002 09:51:24.992098 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.312173 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" event={"ID":"f31283d3-960a-4d9e-809f-6c98df068e39","Type":"ContainerDied","Data":"0379e56ac0a3fd46747840fb46c2f9e3beb28b912ddfa227c2c91dc43e251348"} Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.312278 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0379e56ac0a3fd46747840fb46c2f9e3beb28b912ddfa227c2c91dc43e251348" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.312220 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-zh7g9" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.542863 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-5hmvk"] Oct 02 09:51:25 crc kubenswrapper[4722]: E1002 09:51:25.543212 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31283d3-960a-4d9e-809f-6c98df068e39" containerName="barbican-db-sync" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.543231 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31283d3-960a-4d9e-809f-6c98df068e39" containerName="barbican-db-sync" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.543388 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31283d3-960a-4d9e-809f-6c98df068e39" containerName="barbican-db-sync" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.544252 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.547251 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-worker-config-data" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.547485 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-cs9cf" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.547688 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.562472 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb"] Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.563793 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.565945 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-keystone-listener-config-data" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.572133 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-5hmvk"] Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.594470 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb"] Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.699676 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2"] Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.701361 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.703869 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-api-config-data" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.714233 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2"] Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.717175 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.717212 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b995eb06-9120-4c50-8c99-16f42293107b-logs\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.717234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kmgz\" (UniqueName: \"kubernetes.io/projected/653b7a1c-d895-4141-a71d-b50a28085148-kube-api-access-2kmgz\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.717260 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.717298 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcprw\" (UniqueName: \"kubernetes.io/projected/b995eb06-9120-4c50-8c99-16f42293107b-kube-api-access-qcprw\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.717366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data-custom\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.717463 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data-custom\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 
09:51:25.717493 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653b7a1c-d895-4141-a71d-b50a28085148-logs\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.818484 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-logs\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.818595 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819486 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b995eb06-9120-4c50-8c99-16f42293107b-logs\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kmgz\" (UniqueName: \"kubernetes.io/projected/653b7a1c-d895-4141-a71d-b50a28085148-kube-api-access-2kmgz\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819558 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819602 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcprw\" (UniqueName: \"kubernetes.io/projected/b995eb06-9120-4c50-8c99-16f42293107b-kube-api-access-qcprw\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819636 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data-custom\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819672 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data-custom\") pod 
\"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819737 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819803 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr645\" (UniqueName: \"kubernetes.io/projected/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-kube-api-access-jr645\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819850 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data-custom\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653b7a1c-d895-4141-a71d-b50a28085148-logs\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.819847 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b995eb06-9120-4c50-8c99-16f42293107b-logs\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.820354 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653b7a1c-d895-4141-a71d-b50a28085148-logs\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.826414 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.831519 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.837590 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kmgz\" 
(UniqueName: \"kubernetes.io/projected/653b7a1c-d895-4141-a71d-b50a28085148-kube-api-access-2kmgz\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.838011 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcprw\" (UniqueName: \"kubernetes.io/projected/b995eb06-9120-4c50-8c99-16f42293107b-kube-api-access-qcprw\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.838531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data-custom\") pod \"barbican-keystone-listener-7db7564864-jnbnb\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.848781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data-custom\") pod \"barbican-worker-746c8577-5hmvk\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.860871 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.883269 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.921296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data-custom\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.921400 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.921438 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr645\" (UniqueName: \"kubernetes.io/projected/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-kube-api-access-jr645\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.921483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-logs\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.921994 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-logs\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.926003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.929564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data-custom\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:25 crc kubenswrapper[4722]: I1002 09:51:25.944081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr645\" (UniqueName: \"kubernetes.io/projected/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-kube-api-access-jr645\") pod \"barbican-api-dd979cfb4-dszx2\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.018896 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.101693 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-5hmvk"] Oct 02 09:51:26 crc kubenswrapper[4722]: W1002 09:51:26.111095 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod653b7a1c_d895_4141_a71d_b50a28085148.slice/crio-67693f198c8ada2b2796d7514f65d168feec8b8a45897bcc16e6944d189fbbd8 WatchSource:0}: Error finding container 67693f198c8ada2b2796d7514f65d168feec8b8a45897bcc16e6944d189fbbd8: Status 404 returned error can't find the container with id 67693f198c8ada2b2796d7514f65d168feec8b8a45897bcc16e6944d189fbbd8 Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.319120 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" event={"ID":"653b7a1c-d895-4141-a71d-b50a28085148","Type":"ContainerStarted","Data":"67693f198c8ada2b2796d7514f65d168feec8b8a45897bcc16e6944d189fbbd8"} Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.358880 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb"] Oct 02 09:51:26 crc kubenswrapper[4722]: W1002 09:51:26.363102 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb995eb06_9120_4c50_8c99_16f42293107b.slice/crio-578c5648d404ee8e6a39a2f1b40fa818c862f244f0144c59bf22edb2e37ea9d7 WatchSource:0}: Error finding container 578c5648d404ee8e6a39a2f1b40fa818c862f244f0144c59bf22edb2e37ea9d7: Status 404 returned error can't find the container with id 578c5648d404ee8e6a39a2f1b40fa818c862f244f0144c59bf22edb2e37ea9d7 Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.440378 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2"] Oct 02 09:51:26 crc kubenswrapper[4722]: W1002 09:51:26.446278 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda8316eb_4db3_4ca4_a64b_3ed59814a0b8.slice/crio-505281987dfe87ea5827d5ee0a6d204ed08930facc95e64d067f345082f49e32 WatchSource:0}: Error finding container 505281987dfe87ea5827d5ee0a6d204ed08930facc95e64d067f345082f49e32: Status 404 returned error can't find the container with id 505281987dfe87ea5827d5ee0a6d204ed08930facc95e64d067f345082f49e32 Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.673122 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f"] Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.680226 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.687983 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f"] Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.836557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3603a0b3-7883-43ad-a297-d782d6afa067-logs\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.836749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data-custom\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.836795 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.836872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7dwd\" (UniqueName: \"kubernetes.io/projected/3603a0b3-7883-43ad-a297-d782d6afa067-kube-api-access-t7dwd\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.881481 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz"] Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.882999 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.899685 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz"] Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.938041 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data-custom\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.938126 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.938189 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7dwd\" (UniqueName: \"kubernetes.io/projected/3603a0b3-7883-43ad-a297-d782d6afa067-kube-api-access-t7dwd\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.938237 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3603a0b3-7883-43ad-a297-d782d6afa067-logs\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.939272 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3603a0b3-7883-43ad-a297-d782d6afa067-logs\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.944224 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.944853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data-custom\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:26 crc kubenswrapper[4722]: I1002 09:51:26.985927 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7dwd\" (UniqueName: \"kubernetes.io/projected/3603a0b3-7883-43ad-a297-d782d6afa067-kube-api-access-t7dwd\") pod \"barbican-api-dd979cfb4-ppq8f\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.036720 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.040649 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.040755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data-custom\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.040816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837089fe-a05d-42fb-913c-41a52ed8e2a5-logs\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.040846 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nh7m\" (UniqueName: \"kubernetes.io/projected/837089fe-a05d-42fb-913c-41a52ed8e2a5-kube-api-access-7nh7m\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.142634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data-custom\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.142752 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837089fe-a05d-42fb-913c-41a52ed8e2a5-logs\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.142787 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nh7m\" (UniqueName: \"kubernetes.io/projected/837089fe-a05d-42fb-913c-41a52ed8e2a5-kube-api-access-7nh7m\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.142864 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " 
pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.144041 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837089fe-a05d-42fb-913c-41a52ed8e2a5-logs\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.151226 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data-custom\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.152753 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.163253 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-jwnfm"] Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.164786 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.181820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-jwnfm"] Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.203412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nh7m\" (UniqueName: \"kubernetes.io/projected/837089fe-a05d-42fb-913c-41a52ed8e2a5-kube-api-access-7nh7m\") pod \"barbican-keystone-listener-7db7564864-mqwgz\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.344147 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" event={"ID":"b995eb06-9120-4c50-8c99-16f42293107b","Type":"ContainerStarted","Data":"578c5648d404ee8e6a39a2f1b40fa818c862f244f0144c59bf22edb2e37ea9d7"} Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.344590 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.344715 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2627bdae-7974-4dc6-804d-bde2b54c4b3f-logs\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.344784 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data-custom\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.344809 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxz8w\" (UniqueName: \"kubernetes.io/projected/2627bdae-7974-4dc6-804d-bde2b54c4b3f-kube-api-access-qxz8w\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.345970 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" event={"ID":"da8316eb-4db3-4ca4-a64b-3ed59814a0b8","Type":"ContainerStarted","Data":"adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1"} Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.346004 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" event={"ID":"da8316eb-4db3-4ca4-a64b-3ed59814a0b8","Type":"ContainerStarted","Data":"5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c"} Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.346018 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" event={"ID":"da8316eb-4db3-4ca4-a64b-3ed59814a0b8","Type":"ContainerStarted","Data":"505281987dfe87ea5827d5ee0a6d204ed08930facc95e64d067f345082f49e32"} Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.347003 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.347030 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.376472 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" podStartSLOduration=2.376439923 podStartE2EDuration="2.376439923s" podCreationTimestamp="2025-10-02 09:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:51:27.369024859 +0000 UTC m=+1143.405777859" watchObservedRunningTime="2025-10-02 09:51:27.376439923 +0000 UTC m=+1143.413192903" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.415141 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f"] Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.446504 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.446648 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2627bdae-7974-4dc6-804d-bde2b54c4b3f-logs\") pod \"barbican-worker-746c8577-jwnfm\" 
(UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.446678 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data-custom\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.446695 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxz8w\" (UniqueName: \"kubernetes.io/projected/2627bdae-7974-4dc6-804d-bde2b54c4b3f-kube-api-access-qxz8w\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.448604 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2627bdae-7974-4dc6-804d-bde2b54c4b3f-logs\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.454042 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data-custom\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.454082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.465633 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxz8w\" (UniqueName: \"kubernetes.io/projected/2627bdae-7974-4dc6-804d-bde2b54c4b3f-kube-api-access-qxz8w\") pod \"barbican-worker-746c8577-jwnfm\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.486754 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:27 crc kubenswrapper[4722]: I1002 09:51:27.500419 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:27 crc kubenswrapper[4722]: W1002 09:51:27.745370 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3603a0b3_7883_43ad_a297_d782d6afa067.slice/crio-da7d3a8e15de88121c5346fecc9ad66713de07c6af78b2115093938497653c8c WatchSource:0}: Error finding container da7d3a8e15de88121c5346fecc9ad66713de07c6af78b2115093938497653c8c: Status 404 returned error can't find the container with id da7d3a8e15de88121c5346fecc9ad66713de07c6af78b2115093938497653c8c Oct 02 09:51:28 crc kubenswrapper[4722]: I1002 09:51:28.277759 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f"] Oct 02 09:51:28 crc kubenswrapper[4722]: I1002 09:51:28.300771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-jwnfm"] Oct 02 09:51:28 crc kubenswrapper[4722]: I1002 09:51:28.354423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" event={"ID":"3603a0b3-7883-43ad-a297-d782d6afa067","Type":"ContainerStarted","Data":"8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02"} Oct 02 09:51:28 crc kubenswrapper[4722]: I1002 09:51:28.354466 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" event={"ID":"3603a0b3-7883-43ad-a297-d782d6afa067","Type":"ContainerStarted","Data":"da7d3a8e15de88121c5346fecc9ad66713de07c6af78b2115093938497653c8c"} Oct 02 09:51:28 crc kubenswrapper[4722]: I1002 09:51:28.357898 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" event={"ID":"2627bdae-7974-4dc6-804d-bde2b54c4b3f","Type":"ContainerStarted","Data":"f0202407982d0f735c19f13ac562e8d61898666010141e99a92cc726e27d18d4"} Oct 02 09:51:28 crc kubenswrapper[4722]: I1002 09:51:28.364782 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" event={"ID":"653b7a1c-d895-4141-a71d-b50a28085148","Type":"ContainerStarted","Data":"c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d"} Oct 02 09:51:28 crc kubenswrapper[4722]: I1002 09:51:28.369617 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" event={"ID":"b995eb06-9120-4c50-8c99-16f42293107b","Type":"ContainerStarted","Data":"da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024"} Oct 02 09:51:28 crc kubenswrapper[4722]: I1002 09:51:28.397394 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz"] Oct 02 09:51:28 crc kubenswrapper[4722]: W1002 09:51:28.417595 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod837089fe_a05d_42fb_913c_41a52ed8e2a5.slice/crio-3925a05463ae35c7c96f98144aba31183e7074c93d6909da2824e864a13e4198 WatchSource:0}: Error finding container 3925a05463ae35c7c96f98144aba31183e7074c93d6909da2824e864a13e4198: Status 404 returned error can't find the container with id 3925a05463ae35c7c96f98144aba31183e7074c93d6909da2824e864a13e4198 Oct 02 09:51:28 crc kubenswrapper[4722]: I1002 09:51:28.450713 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz"] Oct 02 09:51:28 crc kubenswrapper[4722]: I1002 09:51:28.765589 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-jwnfm"] Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.378560 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" event={"ID":"b995eb06-9120-4c50-8c99-16f42293107b","Type":"ContainerStarted","Data":"8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7"} Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.381863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" event={"ID":"837089fe-a05d-42fb-913c-41a52ed8e2a5","Type":"ContainerStarted","Data":"6bc3f10e6ffbcfeb6d2aefe03a065cf85c1b0235f4091caeb2b3a328fec22120"} Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.381916 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" event={"ID":"837089fe-a05d-42fb-913c-41a52ed8e2a5","Type":"ContainerStarted","Data":"d75f59501b769c4635f9254765268f015f311e6ef2750517e973be3d67a52f7f"} Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.381933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" event={"ID":"837089fe-a05d-42fb-913c-41a52ed8e2a5","Type":"ContainerStarted","Data":"3925a05463ae35c7c96f98144aba31183e7074c93d6909da2824e864a13e4198"} Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.381997 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" podUID="837089fe-a05d-42fb-913c-41a52ed8e2a5" containerName="barbican-keystone-listener-log" containerID="cri-o://d75f59501b769c4635f9254765268f015f311e6ef2750517e973be3d67a52f7f" gracePeriod=30 Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.382139 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" podUID="837089fe-a05d-42fb-913c-41a52ed8e2a5" containerName="barbican-keystone-listener" containerID="cri-o://6bc3f10e6ffbcfeb6d2aefe03a065cf85c1b0235f4091caeb2b3a328fec22120" gracePeriod=30 Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.384034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" event={"ID":"3603a0b3-7883-43ad-a297-d782d6afa067","Type":"ContainerStarted","Data":"041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8"} Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.384152 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" podUID="3603a0b3-7883-43ad-a297-d782d6afa067" containerName="barbican-api-log" containerID="cri-o://8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02" gracePeriod=30 Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.384212 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.384236 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 
09:51:29.384279 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" podUID="3603a0b3-7883-43ad-a297-d782d6afa067" containerName="barbican-api" containerID="cri-o://041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8" gracePeriod=30 Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.386873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" event={"ID":"2627bdae-7974-4dc6-804d-bde2b54c4b3f","Type":"ContainerStarted","Data":"72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66"} Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.386912 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" event={"ID":"2627bdae-7974-4dc6-804d-bde2b54c4b3f","Type":"ContainerStarted","Data":"47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722"} Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.392592 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" event={"ID":"653b7a1c-d895-4141-a71d-b50a28085148","Type":"ContainerStarted","Data":"19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d"} Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.401430 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" podStartSLOduration=2.704918844 podStartE2EDuration="4.401414864s" podCreationTimestamp="2025-10-02 09:51:25 +0000 UTC" firstStartedPulling="2025-10-02 09:51:26.366148298 +0000 UTC m=+1142.402901268" lastFinishedPulling="2025-10-02 09:51:28.062644308 +0000 UTC m=+1144.099397288" observedRunningTime="2025-10-02 09:51:29.398410969 +0000 UTC m=+1145.435163969" watchObservedRunningTime="2025-10-02 09:51:29.401414864 +0000 UTC m=+1145.438167844" Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.416617 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" podStartSLOduration=2.472726286 podStartE2EDuration="4.416596691s" podCreationTimestamp="2025-10-02 09:51:25 +0000 UTC" firstStartedPulling="2025-10-02 09:51:26.113733828 +0000 UTC m=+1142.150486808" lastFinishedPulling="2025-10-02 09:51:28.057604233 +0000 UTC m=+1144.094357213" observedRunningTime="2025-10-02 09:51:29.413426432 +0000 UTC m=+1145.450179432" watchObservedRunningTime="2025-10-02 09:51:29.416596691 +0000 UTC m=+1145.453349681" Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.434150 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" podStartSLOduration=3.434130636 podStartE2EDuration="3.434130636s" podCreationTimestamp="2025-10-02 09:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:51:29.43186532 +0000 UTC m=+1145.468618310" watchObservedRunningTime="2025-10-02 09:51:29.434130636 +0000 UTC m=+1145.470883616" Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.458160 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" podStartSLOduration=2.458139423 podStartE2EDuration="2.458139423s" podCreationTimestamp="2025-10-02 09:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:51:29.453706363 +0000 UTC m=+1145.490459343" watchObservedRunningTime="2025-10-02 09:51:29.458139423 +0000 UTC m=+1145.494892413" Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.475610 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" podStartSLOduration=3.475586536 podStartE2EDuration="3.475586536s" podCreationTimestamp="2025-10-02 09:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:51:29.468792327 +0000 UTC m=+1145.505545317" watchObservedRunningTime="2025-10-02 09:51:29.475586536 +0000 UTC m=+1145.512339516" Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.834042 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2"] Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.904029 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.985806 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data\") pod \"3603a0b3-7883-43ad-a297-d782d6afa067\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.985861 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7dwd\" (UniqueName: \"kubernetes.io/projected/3603a0b3-7883-43ad-a297-d782d6afa067-kube-api-access-t7dwd\") pod \"3603a0b3-7883-43ad-a297-d782d6afa067\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.985912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data-custom\") pod \"3603a0b3-7883-43ad-a297-d782d6afa067\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.985932 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3603a0b3-7883-43ad-a297-d782d6afa067-logs\") pod \"3603a0b3-7883-43ad-a297-d782d6afa067\" (UID: \"3603a0b3-7883-43ad-a297-d782d6afa067\") " Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.986483 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3603a0b3-7883-43ad-a297-d782d6afa067-logs" (OuterVolumeSpecName: "logs") pod "3603a0b3-7883-43ad-a297-d782d6afa067" (UID: "3603a0b3-7883-43ad-a297-d782d6afa067"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:51:29 crc kubenswrapper[4722]: I1002 09:51:29.991277 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3603a0b3-7883-43ad-a297-d782d6afa067" (UID: "3603a0b3-7883-43ad-a297-d782d6afa067"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:29.993565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3603a0b3-7883-43ad-a297-d782d6afa067-kube-api-access-t7dwd" (OuterVolumeSpecName: "kube-api-access-t7dwd") pod "3603a0b3-7883-43ad-a297-d782d6afa067" (UID: "3603a0b3-7883-43ad-a297-d782d6afa067"). InnerVolumeSpecName "kube-api-access-t7dwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.025754 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data" (OuterVolumeSpecName: "config-data") pod "3603a0b3-7883-43ad-a297-d782d6afa067" (UID: "3603a0b3-7883-43ad-a297-d782d6afa067"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.087851 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.087888 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3603a0b3-7883-43ad-a297-d782d6afa067-logs\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.087899 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3603a0b3-7883-43ad-a297-d782d6afa067-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.087908 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7dwd\" (UniqueName: \"kubernetes.io/projected/3603a0b3-7883-43ad-a297-d782d6afa067-kube-api-access-t7dwd\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.126528 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb"] Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.212149 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-5hmvk"] Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.407468 4722 generic.go:334] "Generic (PLEG): container finished" podID="837089fe-a05d-42fb-913c-41a52ed8e2a5" containerID="d75f59501b769c4635f9254765268f015f311e6ef2750517e973be3d67a52f7f" exitCode=143 Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.408552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" event={"ID":"837089fe-a05d-42fb-913c-41a52ed8e2a5","Type":"ContainerDied","Data":"d75f59501b769c4635f9254765268f015f311e6ef2750517e973be3d67a52f7f"} Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.410744 4722 generic.go:334] "Generic (PLEG): container finished" podID="3603a0b3-7883-43ad-a297-d782d6afa067" containerID="041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8" exitCode=0 Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.410790 4722 generic.go:334] "Generic (PLEG): container finished" podID="3603a0b3-7883-43ad-a297-d782d6afa067" containerID="8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02" exitCode=143 Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.412078 4722 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.412161 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" event={"ID":"3603a0b3-7883-43ad-a297-d782d6afa067","Type":"ContainerDied","Data":"041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8"} Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.412231 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" event={"ID":"3603a0b3-7883-43ad-a297-d782d6afa067","Type":"ContainerDied","Data":"8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02"} Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.412266 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f" event={"ID":"3603a0b3-7883-43ad-a297-d782d6afa067","Type":"ContainerDied","Data":"da7d3a8e15de88121c5346fecc9ad66713de07c6af78b2115093938497653c8c"} Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.412295 4722 scope.go:117] "RemoveContainer" containerID="041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.414359 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" podUID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" containerName="barbican-api-log" containerID="cri-o://5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c" gracePeriod=30 Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.414546 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" podUID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" containerName="barbican-worker-log" containerID="cri-o://47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722" gracePeriod=30 Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.414718 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" podUID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" containerName="barbican-api" containerID="cri-o://adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1" gracePeriod=30 Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.414933 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" podUID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" containerName="barbican-worker" containerID="cri-o://72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66" gracePeriod=30 Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.441410 4722 scope.go:117] "RemoveContainer" containerID="8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.455901 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f"] Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.462554 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-ppq8f"] Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.463546 4722 scope.go:117] "RemoveContainer" containerID="041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8" Oct 02 09:51:30 crc kubenswrapper[4722]: E1002 09:51:30.464494 
4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8\": container with ID starting with 041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8 not found: ID does not exist" containerID="041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.464538 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8"} err="failed to get container status \"041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8\": rpc error: code = NotFound desc = could not find container \"041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8\": container with ID starting with 041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8 not found: ID does not exist" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.464566 4722 scope.go:117] "RemoveContainer" containerID="8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02" Oct 02 09:51:30 crc kubenswrapper[4722]: E1002 09:51:30.465025 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02\": container with ID starting with 8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02 not found: ID does not exist" containerID="8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.465155 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02"} err="failed to get container status \"8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02\": rpc error: code = NotFound desc = could not find container \"8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02\": container with ID starting with 8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02 not found: ID does not exist" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.465239 4722 scope.go:117] "RemoveContainer" containerID="041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.465611 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8"} err="failed to get container status \"041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8\": rpc error: code = NotFound desc = could not find container \"041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8\": container with ID starting with 041961d015b48db07fab64f781069e5936eb924b19cb9395ea5edcffd3acf0e8 not found: ID does not exist" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.465629 4722 scope.go:117] "RemoveContainer" containerID="8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.466368 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02"} err="failed to get container status \"8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02\": rpc error: code = NotFound desc = 
could not find container \"8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02\": container with ID starting with 8831532a044c21604a32fa391a96bf539bb809489e61226bbf8792bc41822a02 not found: ID does not exist" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.724615 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3603a0b3-7883-43ad-a297-d782d6afa067" path="/var/lib/kubelet/pods/3603a0b3-7883-43ad-a297-d782d6afa067/volumes" Oct 02 09:51:30 crc kubenswrapper[4722]: I1002 09:51:30.906808 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.000059 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-logs\") pod \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.000113 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data\") pod \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.000261 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr645\" (UniqueName: \"kubernetes.io/projected/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-kube-api-access-jr645\") pod \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.000299 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data-custom\") pod \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\" (UID: \"da8316eb-4db3-4ca4-a64b-3ed59814a0b8\") " Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.001464 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-logs" (OuterVolumeSpecName: "logs") pod "da8316eb-4db3-4ca4-a64b-3ed59814a0b8" (UID: "da8316eb-4db3-4ca4-a64b-3ed59814a0b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.004479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da8316eb-4db3-4ca4-a64b-3ed59814a0b8" (UID: "da8316eb-4db3-4ca4-a64b-3ed59814a0b8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.004637 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-kube-api-access-jr645" (OuterVolumeSpecName: "kube-api-access-jr645") pod "da8316eb-4db3-4ca4-a64b-3ed59814a0b8" (UID: "da8316eb-4db3-4ca4-a64b-3ed59814a0b8"). InnerVolumeSpecName "kube-api-access-jr645". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.044801 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data" (OuterVolumeSpecName: "config-data") pod "da8316eb-4db3-4ca4-a64b-3ed59814a0b8" (UID: "da8316eb-4db3-4ca4-a64b-3ed59814a0b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.073052 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.101740 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-logs\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.101774 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.101786 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr645\" (UniqueName: \"kubernetes.io/projected/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-kube-api-access-jr645\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.101796 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da8316eb-4db3-4ca4-a64b-3ed59814a0b8-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.203217 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxz8w\" (UniqueName: \"kubernetes.io/projected/2627bdae-7974-4dc6-804d-bde2b54c4b3f-kube-api-access-qxz8w\") pod \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.203307 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2627bdae-7974-4dc6-804d-bde2b54c4b3f-logs\") pod \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.203385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data-custom\") pod \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.203414 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data\") pod \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\" (UID: \"2627bdae-7974-4dc6-804d-bde2b54c4b3f\") " Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.203700 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2627bdae-7974-4dc6-804d-bde2b54c4b3f-logs" (OuterVolumeSpecName: "logs") pod "2627bdae-7974-4dc6-804d-bde2b54c4b3f" (UID: "2627bdae-7974-4dc6-804d-bde2b54c4b3f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.207486 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2627bdae-7974-4dc6-804d-bde2b54c4b3f" (UID: "2627bdae-7974-4dc6-804d-bde2b54c4b3f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.207778 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2627bdae-7974-4dc6-804d-bde2b54c4b3f-kube-api-access-qxz8w" (OuterVolumeSpecName: "kube-api-access-qxz8w") pod "2627bdae-7974-4dc6-804d-bde2b54c4b3f" (UID: "2627bdae-7974-4dc6-804d-bde2b54c4b3f"). InnerVolumeSpecName "kube-api-access-qxz8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.239089 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data" (OuterVolumeSpecName: "config-data") pod "2627bdae-7974-4dc6-804d-bde2b54c4b3f" (UID: "2627bdae-7974-4dc6-804d-bde2b54c4b3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.305232 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxz8w\" (UniqueName: \"kubernetes.io/projected/2627bdae-7974-4dc6-804d-bde2b54c4b3f-kube-api-access-qxz8w\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.305274 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2627bdae-7974-4dc6-804d-bde2b54c4b3f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.305286 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.305294 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2627bdae-7974-4dc6-804d-bde2b54c4b3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.395561 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-zh7g9"] Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.401696 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-zh7g9"] Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.421005 4722 generic.go:334] "Generic (PLEG): container finished" podID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" containerID="72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66" exitCode=0 Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.421040 4722 generic.go:334] "Generic (PLEG): container finished" podID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" containerID="47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722" exitCode=143 Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.421089 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.421104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" event={"ID":"2627bdae-7974-4dc6-804d-bde2b54c4b3f","Type":"ContainerDied","Data":"72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66"} Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.421155 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" event={"ID":"2627bdae-7974-4dc6-804d-bde2b54c4b3f","Type":"ContainerDied","Data":"47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722"} Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.421171 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-jwnfm" event={"ID":"2627bdae-7974-4dc6-804d-bde2b54c4b3f","Type":"ContainerDied","Data":"f0202407982d0f735c19f13ac562e8d61898666010141e99a92cc726e27d18d4"} Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.421189 4722 scope.go:117] "RemoveContainer" containerID="72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.422986 4722 generic.go:334] "Generic (PLEG): container finished" podID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" containerID="adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1" exitCode=0 Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.423008 4722 generic.go:334] "Generic (PLEG): container finished" podID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" containerID="5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c" exitCode=143 Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.423046 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.423053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" event={"ID":"da8316eb-4db3-4ca4-a64b-3ed59814a0b8","Type":"ContainerDied","Data":"adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1"} Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.423086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" event={"ID":"da8316eb-4db3-4ca4-a64b-3ed59814a0b8","Type":"ContainerDied","Data":"5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c"} Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.423100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2" event={"ID":"da8316eb-4db3-4ca4-a64b-3ed59814a0b8","Type":"ContainerDied","Data":"505281987dfe87ea5827d5ee0a6d204ed08930facc95e64d067f345082f49e32"} Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.427051 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" podUID="b995eb06-9120-4c50-8c99-16f42293107b" containerName="barbican-keystone-listener-log" containerID="cri-o://da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024" gracePeriod=30 Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.427433 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" podUID="653b7a1c-d895-4141-a71d-b50a28085148" containerName="barbican-worker-log" containerID="cri-o://c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d" gracePeriod=30 Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.427487 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" podUID="b995eb06-9120-4c50-8c99-16f42293107b" containerName="barbican-keystone-listener" containerID="cri-o://8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7" gracePeriod=30 Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.427529 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" podUID="653b7a1c-d895-4141-a71d-b50a28085148" containerName="barbican-worker" containerID="cri-o://19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d" gracePeriod=30 Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.453238 4722 scope.go:117] "RemoveContainer" containerID="47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477395 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican153a-account-delete-tn4h5"] Oct 02 09:51:31 crc kubenswrapper[4722]: E1002 09:51:31.477718 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" containerName="barbican-api-log" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477735 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" containerName="barbican-api-log" Oct 02 09:51:31 crc kubenswrapper[4722]: E1002 09:51:31.477744 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3603a0b3-7883-43ad-a297-d782d6afa067" 
containerName="barbican-api-log" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477750 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3603a0b3-7883-43ad-a297-d782d6afa067" containerName="barbican-api-log" Oct 02 09:51:31 crc kubenswrapper[4722]: E1002 09:51:31.477757 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3603a0b3-7883-43ad-a297-d782d6afa067" containerName="barbican-api" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477763 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3603a0b3-7883-43ad-a297-d782d6afa067" containerName="barbican-api" Oct 02 09:51:31 crc kubenswrapper[4722]: E1002 09:51:31.477777 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" containerName="barbican-worker-log" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477783 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" containerName="barbican-worker-log" Oct 02 09:51:31 crc kubenswrapper[4722]: E1002 09:51:31.477797 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" containerName="barbican-worker" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477803 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" containerName="barbican-worker" Oct 02 09:51:31 crc kubenswrapper[4722]: E1002 09:51:31.477819 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" containerName="barbican-api" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477824 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" containerName="barbican-api" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477946 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3603a0b3-7883-43ad-a297-d782d6afa067" containerName="barbican-api" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477962 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" containerName="barbican-worker-log" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477972 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3603a0b3-7883-43ad-a297-d782d6afa067" containerName="barbican-api-log" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477981 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" containerName="barbican-api-log" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477989 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" containerName="barbican-api" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.477998 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" containerName="barbican-worker" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.478519 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican153a-account-delete-tn4h5" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.488440 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican153a-account-delete-tn4h5"] Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.490176 4722 scope.go:117] "RemoveContainer" containerID="72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66" Oct 02 09:51:31 crc kubenswrapper[4722]: E1002 09:51:31.494023 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66\": container with ID starting with 72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66 not found: ID does not exist" containerID="72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.494066 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66"} err="failed to get container status \"72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66\": rpc error: code = NotFound desc = could not find container \"72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66\": container with ID starting with 72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66 not found: ID does not exist" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.494090 4722 scope.go:117] "RemoveContainer" containerID="47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722" Oct 02 09:51:31 crc kubenswrapper[4722]: E1002 09:51:31.495139 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722\": container with ID starting with 47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722 not found: ID does not exist" containerID="47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.495170 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722"} err="failed to get container status \"47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722\": rpc error: code = NotFound desc = could not find container \"47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722\": container with ID starting with 47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722 not found: ID does not exist" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.495185 4722 scope.go:117] "RemoveContainer" containerID="72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.497373 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66"} err="failed to get container status \"72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66\": rpc error: code = NotFound desc = could not find container \"72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66\": container with ID starting with 72f63d7b784dadda365f18472e454f742af83d6bac9d5cd47f9955de9ed59c66 not found: ID does not exist" Oct 02 09:51:31 crc kubenswrapper[4722]: 
I1002 09:51:31.497411 4722 scope.go:117] "RemoveContainer" containerID="47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.497709 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722"} err="failed to get container status \"47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722\": rpc error: code = NotFound desc = could not find container \"47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722\": container with ID starting with 47e1657efbc494976e2df4d68d82acc3e1768d0a1546a0dc78244208fefd7722 not found: ID does not exist" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.497747 4722 scope.go:117] "RemoveContainer" containerID="adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.503883 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2"] Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.509737 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-dd979cfb4-dszx2"] Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.518212 4722 scope.go:117] "RemoveContainer" containerID="5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.521035 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-jwnfm"] Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.526293 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-jwnfm"] Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.540462 4722 scope.go:117] "RemoveContainer" containerID="adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1" Oct 02 09:51:31 crc kubenswrapper[4722]: E1002 09:51:31.541743 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1\": container with ID starting with adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1 not found: ID does not exist" containerID="adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.541783 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1"} err="failed to get container status \"adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1\": rpc error: code = NotFound desc = could not find container \"adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1\": container with ID starting with adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1 not found: ID does not exist" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.541811 4722 scope.go:117] "RemoveContainer" containerID="5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c" Oct 02 09:51:31 crc kubenswrapper[4722]: E1002 09:51:31.542095 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c\": container with ID starting with 
5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c not found: ID does not exist" containerID="5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.542121 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c"} err="failed to get container status \"5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c\": rpc error: code = NotFound desc = could not find container \"5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c\": container with ID starting with 5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c not found: ID does not exist" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.542139 4722 scope.go:117] "RemoveContainer" containerID="adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.542366 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1"} err="failed to get container status \"adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1\": rpc error: code = NotFound desc = could not find container \"adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1\": container with ID starting with adf597397b6e96585ffa917145f078bc55467cee7170a6f2e1e737b930204ee1 not found: ID does not exist" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.542385 4722 scope.go:117] "RemoveContainer" containerID="5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.542583 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c"} err="failed to get container status \"5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c\": rpc error: code = NotFound desc = could not find container \"5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c\": container with ID starting with 5494dcf8c5d45261e20f33df9e3f498e3bfff47d1d99fecaf41ebe3f4483d42c not found: ID does not exist" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.608563 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cmr2\" (UniqueName: \"kubernetes.io/projected/3b577de1-217a-4c34-9227-e1a5834211aa-kube-api-access-8cmr2\") pod \"barbican153a-account-delete-tn4h5\" (UID: \"3b577de1-217a-4c34-9227-e1a5834211aa\") " pod="barbican-kuttl-tests/barbican153a-account-delete-tn4h5" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.710094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cmr2\" (UniqueName: \"kubernetes.io/projected/3b577de1-217a-4c34-9227-e1a5834211aa-kube-api-access-8cmr2\") pod \"barbican153a-account-delete-tn4h5\" (UID: \"3b577de1-217a-4c34-9227-e1a5834211aa\") " pod="barbican-kuttl-tests/barbican153a-account-delete-tn4h5" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.728291 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cmr2\" (UniqueName: \"kubernetes.io/projected/3b577de1-217a-4c34-9227-e1a5834211aa-kube-api-access-8cmr2\") pod \"barbican153a-account-delete-tn4h5\" (UID: \"3b577de1-217a-4c34-9227-e1a5834211aa\") " 
pod="barbican-kuttl-tests/barbican153a-account-delete-tn4h5" Oct 02 09:51:31 crc kubenswrapper[4722]: I1002 09:51:31.798451 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican153a-account-delete-tn4h5" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.257148 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican153a-account-delete-tn4h5"] Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.432780 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.435289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican153a-account-delete-tn4h5" event={"ID":"3b577de1-217a-4c34-9227-e1a5834211aa","Type":"ContainerStarted","Data":"586e255b3847b471bf81a4e852599bc9f405ffcb14877812ad6203e1e956d0d3"} Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.444122 4722 generic.go:334] "Generic (PLEG): container finished" podID="653b7a1c-d895-4141-a71d-b50a28085148" containerID="c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d" exitCode=143 Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.444204 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" event={"ID":"653b7a1c-d895-4141-a71d-b50a28085148","Type":"ContainerDied","Data":"c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d"} Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.446631 4722 generic.go:334] "Generic (PLEG): container finished" podID="b995eb06-9120-4c50-8c99-16f42293107b" containerID="8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7" exitCode=0 Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.446663 4722 generic.go:334] "Generic (PLEG): container finished" podID="b995eb06-9120-4c50-8c99-16f42293107b" containerID="da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024" exitCode=143 Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.446665 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.446716 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" event={"ID":"b995eb06-9120-4c50-8c99-16f42293107b","Type":"ContainerDied","Data":"8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7"} Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.446751 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" event={"ID":"b995eb06-9120-4c50-8c99-16f42293107b","Type":"ContainerDied","Data":"da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024"} Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.446763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb" event={"ID":"b995eb06-9120-4c50-8c99-16f42293107b","Type":"ContainerDied","Data":"578c5648d404ee8e6a39a2f1b40fa818c862f244f0144c59bf22edb2e37ea9d7"} Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.446783 4722 scope.go:117] "RemoveContainer" containerID="8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.521764 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data-custom\") pod \"b995eb06-9120-4c50-8c99-16f42293107b\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.521896 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b995eb06-9120-4c50-8c99-16f42293107b-logs\") pod \"b995eb06-9120-4c50-8c99-16f42293107b\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.522008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcprw\" (UniqueName: \"kubernetes.io/projected/b995eb06-9120-4c50-8c99-16f42293107b-kube-api-access-qcprw\") pod \"b995eb06-9120-4c50-8c99-16f42293107b\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.522089 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data\") pod \"b995eb06-9120-4c50-8c99-16f42293107b\" (UID: \"b995eb06-9120-4c50-8c99-16f42293107b\") " Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.523474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b995eb06-9120-4c50-8c99-16f42293107b-logs" (OuterVolumeSpecName: "logs") pod "b995eb06-9120-4c50-8c99-16f42293107b" (UID: "b995eb06-9120-4c50-8c99-16f42293107b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.530095 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b995eb06-9120-4c50-8c99-16f42293107b" (UID: "b995eb06-9120-4c50-8c99-16f42293107b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.530857 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b995eb06-9120-4c50-8c99-16f42293107b-kube-api-access-qcprw" (OuterVolumeSpecName: "kube-api-access-qcprw") pod "b995eb06-9120-4c50-8c99-16f42293107b" (UID: "b995eb06-9120-4c50-8c99-16f42293107b"). InnerVolumeSpecName "kube-api-access-qcprw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.534234 4722 scope.go:117] "RemoveContainer" containerID="da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.552397 4722 scope.go:117] "RemoveContainer" containerID="8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7" Oct 02 09:51:32 crc kubenswrapper[4722]: E1002 09:51:32.553287 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7\": container with ID starting with 8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7 not found: ID does not exist" containerID="8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.553338 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7"} err="failed to get container status \"8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7\": rpc error: code = NotFound desc = could not find container \"8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7\": container with ID starting with 8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7 not found: ID does not exist" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.553369 4722 scope.go:117] "RemoveContainer" containerID="da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024" Oct 02 09:51:32 crc kubenswrapper[4722]: E1002 09:51:32.553730 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024\": container with ID starting with da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024 not found: ID does not exist" containerID="da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.553752 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024"} err="failed to get container status \"da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024\": rpc error: code = NotFound desc = could not find container \"da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024\": container with ID starting with da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024 not found: ID does not exist" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.553768 4722 scope.go:117] "RemoveContainer" containerID="8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.554024 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7"} err="failed to get container status \"8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7\": rpc error: code = NotFound desc = could not find container \"8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7\": container with ID starting with 8b3e04b826e49bebe05c2cb2c452ff7bb1b8680aebcdf3635e5f3c26c78327d7 not found: ID does not exist" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.554041 4722 scope.go:117] "RemoveContainer" containerID="da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.554259 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024"} err="failed to get container status \"da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024\": rpc error: code = NotFound desc = could not find container \"da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024\": container with ID starting with da431a2c9994cd236d7fd334961abd3c3e30ca34703b706853f93e29fbdd2024 not found: ID does not exist" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.575685 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data" (OuterVolumeSpecName: "config-data") pod "b995eb06-9120-4c50-8c99-16f42293107b" (UID: "b995eb06-9120-4c50-8c99-16f42293107b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.626197 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcprw\" (UniqueName: \"kubernetes.io/projected/b995eb06-9120-4c50-8c99-16f42293107b-kube-api-access-qcprw\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.626245 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.626260 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b995eb06-9120-4c50-8c99-16f42293107b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.626282 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b995eb06-9120-4c50-8c99-16f42293107b-logs\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.724723 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2627bdae-7974-4dc6-804d-bde2b54c4b3f" path="/var/lib/kubelet/pods/2627bdae-7974-4dc6-804d-bde2b54c4b3f/volumes" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.725641 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da8316eb-4db3-4ca4-a64b-3ed59814a0b8" path="/var/lib/kubelet/pods/da8316eb-4db3-4ca4-a64b-3ed59814a0b8/volumes" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.726481 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31283d3-960a-4d9e-809f-6c98df068e39" path="/var/lib/kubelet/pods/f31283d3-960a-4d9e-809f-6c98df068e39/volumes" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.747901 4722 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.778697 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb"] Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.785191 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-jnbnb"] Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.829155 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kmgz\" (UniqueName: \"kubernetes.io/projected/653b7a1c-d895-4141-a71d-b50a28085148-kube-api-access-2kmgz\") pod \"653b7a1c-d895-4141-a71d-b50a28085148\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.829452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data\") pod \"653b7a1c-d895-4141-a71d-b50a28085148\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.829492 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data-custom\") pod \"653b7a1c-d895-4141-a71d-b50a28085148\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.829522 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653b7a1c-d895-4141-a71d-b50a28085148-logs\") pod \"653b7a1c-d895-4141-a71d-b50a28085148\" (UID: \"653b7a1c-d895-4141-a71d-b50a28085148\") " Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.829977 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/653b7a1c-d895-4141-a71d-b50a28085148-logs" (OuterVolumeSpecName: "logs") pod "653b7a1c-d895-4141-a71d-b50a28085148" (UID: "653b7a1c-d895-4141-a71d-b50a28085148"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.832360 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "653b7a1c-d895-4141-a71d-b50a28085148" (UID: "653b7a1c-d895-4141-a71d-b50a28085148"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.833602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/653b7a1c-d895-4141-a71d-b50a28085148-kube-api-access-2kmgz" (OuterVolumeSpecName: "kube-api-access-2kmgz") pod "653b7a1c-d895-4141-a71d-b50a28085148" (UID: "653b7a1c-d895-4141-a71d-b50a28085148"). InnerVolumeSpecName "kube-api-access-2kmgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.865304 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data" (OuterVolumeSpecName: "config-data") pod "653b7a1c-d895-4141-a71d-b50a28085148" (UID: "653b7a1c-d895-4141-a71d-b50a28085148"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.931588 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.931631 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/653b7a1c-d895-4141-a71d-b50a28085148-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.931647 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653b7a1c-d895-4141-a71d-b50a28085148-logs\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:32 crc kubenswrapper[4722]: I1002 09:51:32.931658 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kmgz\" (UniqueName: \"kubernetes.io/projected/653b7a1c-d895-4141-a71d-b50a28085148-kube-api-access-2kmgz\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.456796 4722 generic.go:334] "Generic (PLEG): container finished" podID="3b577de1-217a-4c34-9227-e1a5834211aa" containerID="03658ec23435e475b875109a154a717e637a684536a983dea5d585f51fed5a12" exitCode=0 Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.456878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican153a-account-delete-tn4h5" event={"ID":"3b577de1-217a-4c34-9227-e1a5834211aa","Type":"ContainerDied","Data":"03658ec23435e475b875109a154a717e637a684536a983dea5d585f51fed5a12"} Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.459280 4722 generic.go:334] "Generic (PLEG): container finished" podID="653b7a1c-d895-4141-a71d-b50a28085148" containerID="19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d" exitCode=0 Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.459342 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" event={"ID":"653b7a1c-d895-4141-a71d-b50a28085148","Type":"ContainerDied","Data":"19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d"} Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.459368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" event={"ID":"653b7a1c-d895-4141-a71d-b50a28085148","Type":"ContainerDied","Data":"67693f198c8ada2b2796d7514f65d168feec8b8a45897bcc16e6944d189fbbd8"} Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.459376 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-746c8577-5hmvk" Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.459385 4722 scope.go:117] "RemoveContainer" containerID="19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d" Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.477271 4722 scope.go:117] "RemoveContainer" containerID="c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d" Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.494149 4722 scope.go:117] "RemoveContainer" containerID="19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d" Oct 02 09:51:33 crc kubenswrapper[4722]: E1002 09:51:33.496011 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d\": container with ID starting with 19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d not found: ID does not exist" containerID="19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d" Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.496071 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d"} err="failed to get container status \"19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d\": rpc error: code = NotFound desc = could not find container \"19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d\": container with ID starting with 19a939edbe9af6cccf036fabab0de8ca5df8f2ca2f62fa538da175dd24e5390d not found: ID does not exist" Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.496106 4722 scope.go:117] "RemoveContainer" containerID="c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d" Oct 02 09:51:33 crc kubenswrapper[4722]: E1002 09:51:33.497364 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d\": container with ID starting with c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d not found: ID does not exist" containerID="c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d" Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.497419 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d"} err="failed to get container status \"c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d\": rpc error: code = NotFound desc = could not find container \"c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d\": container with ID starting with c16086f748292ed300e8e9e9917cb3cd01d11d5fb854ecc80a14c2c67393bf8d not found: ID does not exist" Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.503675 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-5hmvk"] Oct 02 09:51:33 crc kubenswrapper[4722]: I1002 09:51:33.508463 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-746c8577-5hmvk"] Oct 02 09:51:34 crc kubenswrapper[4722]: I1002 09:51:34.716496 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican153a-account-delete-tn4h5" Oct 02 09:51:34 crc kubenswrapper[4722]: I1002 09:51:34.717578 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="653b7a1c-d895-4141-a71d-b50a28085148" path="/var/lib/kubelet/pods/653b7a1c-d895-4141-a71d-b50a28085148/volumes" Oct 02 09:51:34 crc kubenswrapper[4722]: I1002 09:51:34.718205 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b995eb06-9120-4c50-8c99-16f42293107b" path="/var/lib/kubelet/pods/b995eb06-9120-4c50-8c99-16f42293107b/volumes" Oct 02 09:51:34 crc kubenswrapper[4722]: I1002 09:51:34.862101 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cmr2\" (UniqueName: \"kubernetes.io/projected/3b577de1-217a-4c34-9227-e1a5834211aa-kube-api-access-8cmr2\") pod \"3b577de1-217a-4c34-9227-e1a5834211aa\" (UID: \"3b577de1-217a-4c34-9227-e1a5834211aa\") " Oct 02 09:51:34 crc kubenswrapper[4722]: I1002 09:51:34.867512 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b577de1-217a-4c34-9227-e1a5834211aa-kube-api-access-8cmr2" (OuterVolumeSpecName: "kube-api-access-8cmr2") pod "3b577de1-217a-4c34-9227-e1a5834211aa" (UID: "3b577de1-217a-4c34-9227-e1a5834211aa"). InnerVolumeSpecName "kube-api-access-8cmr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:34 crc kubenswrapper[4722]: I1002 09:51:34.964243 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cmr2\" (UniqueName: \"kubernetes.io/projected/3b577de1-217a-4c34-9227-e1a5834211aa-kube-api-access-8cmr2\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:35 crc kubenswrapper[4722]: I1002 09:51:35.481230 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican153a-account-delete-tn4h5" event={"ID":"3b577de1-217a-4c34-9227-e1a5834211aa","Type":"ContainerDied","Data":"586e255b3847b471bf81a4e852599bc9f405ffcb14877812ad6203e1e956d0d3"} Oct 02 09:51:35 crc kubenswrapper[4722]: I1002 09:51:35.481292 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="586e255b3847b471bf81a4e852599bc9f405ffcb14877812ad6203e1e956d0d3" Oct 02 09:51:35 crc kubenswrapper[4722]: I1002 09:51:35.481299 4722 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.470447 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-dn5np"]
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.477999 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-dn5np"]
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.482914 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-153a-account-create-bxl99"]
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.488838 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-153a-account-create-bxl99"]
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.494385 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican153a-account-delete-tn4h5"]
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.499654 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican153a-account-delete-tn4h5"]
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526079 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-create-zzdpc"]
Oct 02 09:51:36 crc kubenswrapper[4722]: E1002 09:51:36.526371 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653b7a1c-d895-4141-a71d-b50a28085148" containerName="barbican-worker"
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526390 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="653b7a1c-d895-4141-a71d-b50a28085148" containerName="barbican-worker"
Oct 02 09:51:36 crc kubenswrapper[4722]: E1002 09:51:36.526404 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653b7a1c-d895-4141-a71d-b50a28085148" containerName="barbican-worker-log"
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526413 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="653b7a1c-d895-4141-a71d-b50a28085148" containerName="barbican-worker-log"
Oct 02 09:51:36 crc kubenswrapper[4722]: E1002 09:51:36.526433 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b995eb06-9120-4c50-8c99-16f42293107b" containerName="barbican-keystone-listener-log"
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526440 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b995eb06-9120-4c50-8c99-16f42293107b" containerName="barbican-keystone-listener-log"
Oct 02 09:51:36 crc kubenswrapper[4722]: E1002 09:51:36.526455 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b995eb06-9120-4c50-8c99-16f42293107b" containerName="barbican-keystone-listener"
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526461 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b995eb06-9120-4c50-8c99-16f42293107b" containerName="barbican-keystone-listener"
Oct 02 09:51:36 crc kubenswrapper[4722]: E1002 09:51:36.526467 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b577de1-217a-4c34-9227-e1a5834211aa" containerName="mariadb-account-delete"
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526473 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b577de1-217a-4c34-9227-e1a5834211aa" containerName="mariadb-account-delete"
Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526592 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b995eb06-9120-4c50-8c99-16f42293107b" containerName="barbican-keystone-listener"
containerName="barbican-keystone-listener" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526604 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b577de1-217a-4c34-9227-e1a5834211aa" containerName="mariadb-account-delete" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526612 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="653b7a1c-d895-4141-a71d-b50a28085148" containerName="barbican-worker" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526621 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b995eb06-9120-4c50-8c99-16f42293107b" containerName="barbican-keystone-listener-log" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.526630 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="653b7a1c-d895-4141-a71d-b50a28085148" containerName="barbican-worker-log" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.527114 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-zzdpc" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.536964 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-zzdpc"] Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.589010 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxhkb\" (UniqueName: \"kubernetes.io/projected/0609424e-afb3-4d80-bf7a-bf8a09a0a2e8-kube-api-access-lxhkb\") pod \"barbican-db-create-zzdpc\" (UID: \"0609424e-afb3-4d80-bf7a-bf8a09a0a2e8\") " pod="barbican-kuttl-tests/barbican-db-create-zzdpc" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.690693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxhkb\" (UniqueName: \"kubernetes.io/projected/0609424e-afb3-4d80-bf7a-bf8a09a0a2e8-kube-api-access-lxhkb\") pod \"barbican-db-create-zzdpc\" (UID: \"0609424e-afb3-4d80-bf7a-bf8a09a0a2e8\") " pod="barbican-kuttl-tests/barbican-db-create-zzdpc" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.707398 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxhkb\" (UniqueName: \"kubernetes.io/projected/0609424e-afb3-4d80-bf7a-bf8a09a0a2e8-kube-api-access-lxhkb\") pod \"barbican-db-create-zzdpc\" (UID: \"0609424e-afb3-4d80-bf7a-bf8a09a0a2e8\") " pod="barbican-kuttl-tests/barbican-db-create-zzdpc" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.718383 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b577de1-217a-4c34-9227-e1a5834211aa" path="/var/lib/kubelet/pods/3b577de1-217a-4c34-9227-e1a5834211aa/volumes" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.718938 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e64138-2948-4ba2-9cab-d87233718abe" path="/var/lib/kubelet/pods/d4e64138-2948-4ba2-9cab-d87233718abe/volumes" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.719462 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe829993-3dea-4c1c-826f-3d8cea2363e7" path="/var/lib/kubelet/pods/fe829993-3dea-4c1c-826f-3d8cea2363e7/volumes" Oct 02 09:51:36 crc kubenswrapper[4722]: I1002 09:51:36.849804 4722 util.go:30] "No sandbox for pod can be found. 
Oct 02 09:51:37 crc kubenswrapper[4722]: I1002 09:51:37.250721 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-zzdpc"]
Oct 02 09:51:37 crc kubenswrapper[4722]: I1002 09:51:37.494650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-zzdpc" event={"ID":"0609424e-afb3-4d80-bf7a-bf8a09a0a2e8","Type":"ContainerStarted","Data":"dc20522dccaacb53807c7cf36a60ea5ba4c79fcb9d602621e7f87d9b33f2337b"}
Oct 02 09:51:38 crc kubenswrapper[4722]: I1002 09:51:38.501666 4722 generic.go:334] "Generic (PLEG): container finished" podID="0609424e-afb3-4d80-bf7a-bf8a09a0a2e8" containerID="914876550fa92687eacf3ef88ca106da7cae7d5ed05ec61721d9880195afa93c" exitCode=0
Oct 02 09:51:38 crc kubenswrapper[4722]: I1002 09:51:38.501719 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-zzdpc" event={"ID":"0609424e-afb3-4d80-bf7a-bf8a09a0a2e8","Type":"ContainerDied","Data":"914876550fa92687eacf3ef88ca106da7cae7d5ed05ec61721d9880195afa93c"}
Oct 02 09:51:39 crc kubenswrapper[4722]: I1002 09:51:39.748028 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-zzdpc"
Oct 02 09:51:39 crc kubenswrapper[4722]: I1002 09:51:39.840607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxhkb\" (UniqueName: \"kubernetes.io/projected/0609424e-afb3-4d80-bf7a-bf8a09a0a2e8-kube-api-access-lxhkb\") pod \"0609424e-afb3-4d80-bf7a-bf8a09a0a2e8\" (UID: \"0609424e-afb3-4d80-bf7a-bf8a09a0a2e8\") "
Oct 02 09:51:39 crc kubenswrapper[4722]: I1002 09:51:39.846578 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0609424e-afb3-4d80-bf7a-bf8a09a0a2e8-kube-api-access-lxhkb" (OuterVolumeSpecName: "kube-api-access-lxhkb") pod "0609424e-afb3-4d80-bf7a-bf8a09a0a2e8" (UID: "0609424e-afb3-4d80-bf7a-bf8a09a0a2e8"). InnerVolumeSpecName "kube-api-access-lxhkb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:51:39 crc kubenswrapper[4722]: I1002 09:51:39.941893 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxhkb\" (UniqueName: \"kubernetes.io/projected/0609424e-afb3-4d80-bf7a-bf8a09a0a2e8-kube-api-access-lxhkb\") on node \"crc\" DevicePath \"\""
Oct 02 09:51:40 crc kubenswrapper[4722]: I1002 09:51:40.518199 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-zzdpc" event={"ID":"0609424e-afb3-4d80-bf7a-bf8a09a0a2e8","Type":"ContainerDied","Data":"dc20522dccaacb53807c7cf36a60ea5ba4c79fcb9d602621e7f87d9b33f2337b"}
Oct 02 09:51:40 crc kubenswrapper[4722]: I1002 09:51:40.518465 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc20522dccaacb53807c7cf36a60ea5ba4c79fcb9d602621e7f87d9b33f2337b"
Oct 02 09:51:40 crc kubenswrapper[4722]: I1002 09:51:40.518440 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-zzdpc"
Oct 02 09:51:46 crc kubenswrapper[4722]: I1002 09:51:46.648971 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-c513-account-create-fpzlj"]
Oct 02 09:51:46 crc kubenswrapper[4722]: E1002 09:51:46.649844 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0609424e-afb3-4d80-bf7a-bf8a09a0a2e8" containerName="mariadb-database-create"
Oct 02 09:51:46 crc kubenswrapper[4722]: I1002 09:51:46.649857 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0609424e-afb3-4d80-bf7a-bf8a09a0a2e8" containerName="mariadb-database-create"
Oct 02 09:51:46 crc kubenswrapper[4722]: I1002 09:51:46.649980 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0609424e-afb3-4d80-bf7a-bf8a09a0a2e8" containerName="mariadb-database-create"
Oct 02 09:51:46 crc kubenswrapper[4722]: I1002 09:51:46.650510 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-c513-account-create-fpzlj"
Oct 02 09:51:46 crc kubenswrapper[4722]: I1002 09:51:46.653745 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret"
Oct 02 09:51:46 crc kubenswrapper[4722]: I1002 09:51:46.657122 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-c513-account-create-fpzlj"]
Oct 02 09:51:46 crc kubenswrapper[4722]: I1002 09:51:46.729279 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4f7m\" (UniqueName: \"kubernetes.io/projected/a2358339-8968-42f4-9f6d-05933fb3e2f7-kube-api-access-j4f7m\") pod \"barbican-c513-account-create-fpzlj\" (UID: \"a2358339-8968-42f4-9f6d-05933fb3e2f7\") " pod="barbican-kuttl-tests/barbican-c513-account-create-fpzlj"
Oct 02 09:51:46 crc kubenswrapper[4722]: I1002 09:51:46.831194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4f7m\" (UniqueName: \"kubernetes.io/projected/a2358339-8968-42f4-9f6d-05933fb3e2f7-kube-api-access-j4f7m\") pod \"barbican-c513-account-create-fpzlj\" (UID: \"a2358339-8968-42f4-9f6d-05933fb3e2f7\") " pod="barbican-kuttl-tests/barbican-c513-account-create-fpzlj"
Oct 02 09:51:46 crc kubenswrapper[4722]: I1002 09:51:46.865798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4f7m\" (UniqueName: \"kubernetes.io/projected/a2358339-8968-42f4-9f6d-05933fb3e2f7-kube-api-access-j4f7m\") pod \"barbican-c513-account-create-fpzlj\" (UID: \"a2358339-8968-42f4-9f6d-05933fb3e2f7\") " pod="barbican-kuttl-tests/barbican-c513-account-create-fpzlj"
Oct 02 09:51:46 crc kubenswrapper[4722]: I1002 09:51:46.975133 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-c513-account-create-fpzlj"
Oct 02 09:51:47 crc kubenswrapper[4722]: I1002 09:51:47.468197 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-c513-account-create-fpzlj"]
Oct 02 09:51:47 crc kubenswrapper[4722]: I1002 09:51:47.568090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-c513-account-create-fpzlj" event={"ID":"a2358339-8968-42f4-9f6d-05933fb3e2f7","Type":"ContainerStarted","Data":"e50f36d30085a22774a1afd26e940cddf5149ce3d181e95d379d822191bd37c7"}
Oct 02 09:51:48 crc kubenswrapper[4722]: I1002 09:51:48.576647 4722 generic.go:334] "Generic (PLEG): container finished" podID="a2358339-8968-42f4-9f6d-05933fb3e2f7" containerID="9c8d9cdffe3739783b9c84aadf5882794ba34aa4f179ab4efcece743b2dbc3c1" exitCode=0
Oct 02 09:51:48 crc kubenswrapper[4722]: I1002 09:51:48.576706 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-c513-account-create-fpzlj" event={"ID":"a2358339-8968-42f4-9f6d-05933fb3e2f7","Type":"ContainerDied","Data":"9c8d9cdffe3739783b9c84aadf5882794ba34aa4f179ab4efcece743b2dbc3c1"}
Oct 02 09:51:49 crc kubenswrapper[4722]: I1002 09:51:49.845767 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-c513-account-create-fpzlj"
Oct 02 09:51:49 crc kubenswrapper[4722]: I1002 09:51:49.974610 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4f7m\" (UniqueName: \"kubernetes.io/projected/a2358339-8968-42f4-9f6d-05933fb3e2f7-kube-api-access-j4f7m\") pod \"a2358339-8968-42f4-9f6d-05933fb3e2f7\" (UID: \"a2358339-8968-42f4-9f6d-05933fb3e2f7\") "
Oct 02 09:51:49 crc kubenswrapper[4722]: I1002 09:51:49.981050 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2358339-8968-42f4-9f6d-05933fb3e2f7-kube-api-access-j4f7m" (OuterVolumeSpecName: "kube-api-access-j4f7m") pod "a2358339-8968-42f4-9f6d-05933fb3e2f7" (UID: "a2358339-8968-42f4-9f6d-05933fb3e2f7"). InnerVolumeSpecName "kube-api-access-j4f7m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:51:50 crc kubenswrapper[4722]: I1002 09:51:50.077552 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4f7m\" (UniqueName: \"kubernetes.io/projected/a2358339-8968-42f4-9f6d-05933fb3e2f7-kube-api-access-j4f7m\") on node \"crc\" DevicePath \"\""
Oct 02 09:51:50 crc kubenswrapper[4722]: I1002 09:51:50.591661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-c513-account-create-fpzlj" event={"ID":"a2358339-8968-42f4-9f6d-05933fb3e2f7","Type":"ContainerDied","Data":"e50f36d30085a22774a1afd26e940cddf5149ce3d181e95d379d822191bd37c7"}
Oct 02 09:51:50 crc kubenswrapper[4722]: I1002 09:51:50.591706 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-c513-account-create-fpzlj"
Oct 02 09:51:50 crc kubenswrapper[4722]: I1002 09:51:50.592246 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e50f36d30085a22774a1afd26e940cddf5149ce3d181e95d379d822191bd37c7"
Oct 02 09:51:51 crc kubenswrapper[4722]: I1002 09:51:51.816756 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-nqdbc"]
Oct 02 09:51:51 crc kubenswrapper[4722]: E1002 09:51:51.817032 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2358339-8968-42f4-9f6d-05933fb3e2f7" containerName="mariadb-account-create"
Oct 02 09:51:51 crc kubenswrapper[4722]: I1002 09:51:51.817047 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2358339-8968-42f4-9f6d-05933fb3e2f7" containerName="mariadb-account-create"
Oct 02 09:51:51 crc kubenswrapper[4722]: I1002 09:51:51.817179 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2358339-8968-42f4-9f6d-05933fb3e2f7" containerName="mariadb-account-create"
Oct 02 09:51:51 crc kubenswrapper[4722]: I1002 09:51:51.817637 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-nqdbc"
Oct 02 09:51:51 crc kubenswrapper[4722]: I1002 09:51:51.824905 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"combined-ca-bundle"
Oct 02 09:51:51 crc kubenswrapper[4722]: I1002 09:51:51.824905 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-ct4tg"
Oct 02 09:51:51 crc kubenswrapper[4722]: I1002 09:51:51.831251 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-nqdbc"]
Oct 02 09:51:51 crc kubenswrapper[4722]: I1002 09:51:51.901001 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-combined-ca-bundle\") pod \"barbican-db-sync-nqdbc\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") " pod="barbican-kuttl-tests/barbican-db-sync-nqdbc"
Oct 02 09:51:51 crc kubenswrapper[4722]: I1002 09:51:51.901066 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-db-sync-config-data\") pod \"barbican-db-sync-nqdbc\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") " pod="barbican-kuttl-tests/barbican-db-sync-nqdbc"
Oct 02 09:51:51 crc kubenswrapper[4722]: I1002 09:51:51.901098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28c8\" (UniqueName: \"kubernetes.io/projected/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-kube-api-access-f28c8\") pod \"barbican-db-sync-nqdbc\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") " pod="barbican-kuttl-tests/barbican-db-sync-nqdbc"
Oct 02 09:51:52 crc kubenswrapper[4722]: I1002 09:51:52.002507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-db-sync-config-data\") pod \"barbican-db-sync-nqdbc\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") " pod="barbican-kuttl-tests/barbican-db-sync-nqdbc"
Oct 02 09:51:52 crc kubenswrapper[4722]: I1002 09:51:52.002583 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28c8\" (UniqueName: \"kubernetes.io/projected/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-kube-api-access-f28c8\") pod \"barbican-db-sync-nqdbc\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") " pod="barbican-kuttl-tests/barbican-db-sync-nqdbc"
"operationExecutor.MountVolume started for volume \"kube-api-access-f28c8\" (UniqueName: \"kubernetes.io/projected/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-kube-api-access-f28c8\") pod \"barbican-db-sync-nqdbc\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") " pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" Oct 02 09:51:52 crc kubenswrapper[4722]: I1002 09:51:52.002682 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-combined-ca-bundle\") pod \"barbican-db-sync-nqdbc\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") " pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" Oct 02 09:51:52 crc kubenswrapper[4722]: I1002 09:51:52.007915 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-combined-ca-bundle\") pod \"barbican-db-sync-nqdbc\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") " pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" Oct 02 09:51:52 crc kubenswrapper[4722]: I1002 09:51:52.007960 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-db-sync-config-data\") pod \"barbican-db-sync-nqdbc\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") " pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" Oct 02 09:51:52 crc kubenswrapper[4722]: I1002 09:51:52.022889 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f28c8\" (UniqueName: \"kubernetes.io/projected/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-kube-api-access-f28c8\") pod \"barbican-db-sync-nqdbc\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") " pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" Oct 02 09:51:52 crc kubenswrapper[4722]: I1002 09:51:52.132502 4722 util.go:30] "No sandbox for pod can be found. 
Oct 02 09:51:52 crc kubenswrapper[4722]: I1002 09:51:52.532715 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-nqdbc"]
Oct 02 09:51:52 crc kubenswrapper[4722]: I1002 09:51:52.608504 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" event={"ID":"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4","Type":"ContainerStarted","Data":"d7d440f21a1684ccbd4a99dc9b61a268b2cf7333d81b3b8c5b5820c2e96e0c0f"}
Oct 02 09:51:53 crc kubenswrapper[4722]: I1002 09:51:53.616726 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" event={"ID":"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4","Type":"ContainerStarted","Data":"813bbab6b0c394b08527dfb0e0d63037078a1a4f044f5d301bcf8ea84be58f04"}
Oct 02 09:51:53 crc kubenswrapper[4722]: I1002 09:51:53.637787 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" podStartSLOduration=2.637742941 podStartE2EDuration="2.637742941s" podCreationTimestamp="2025-10-02 09:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:51:53.633811273 +0000 UTC m=+1169.670564253" watchObservedRunningTime="2025-10-02 09:51:53.637742941 +0000 UTC m=+1169.674495921"
Oct 02 09:51:54 crc kubenswrapper[4722]: I1002 09:51:54.639001 4722 generic.go:334] "Generic (PLEG): container finished" podID="b9b5a58f-c0f4-4e76-88ef-842a2f936ee4" containerID="813bbab6b0c394b08527dfb0e0d63037078a1a4f044f5d301bcf8ea84be58f04" exitCode=0
Oct 02 09:51:54 crc kubenswrapper[4722]: I1002 09:51:54.639905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" event={"ID":"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4","Type":"ContainerDied","Data":"813bbab6b0c394b08527dfb0e0d63037078a1a4f044f5d301bcf8ea84be58f04"}
Oct 02 09:51:55 crc kubenswrapper[4722]: I1002 09:51:55.903776 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-nqdbc"
Oct 02 09:51:55 crc kubenswrapper[4722]: I1002 09:51:55.958258 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-combined-ca-bundle\") pod \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") "
Oct 02 09:51:55 crc kubenswrapper[4722]: I1002 09:51:55.958968 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-db-sync-config-data\") pod \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") "
Oct 02 09:51:55 crc kubenswrapper[4722]: I1002 09:51:55.959095 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f28c8\" (UniqueName: \"kubernetes.io/projected/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-kube-api-access-f28c8\") pod \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\" (UID: \"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4\") "
Oct 02 09:51:55 crc kubenswrapper[4722]: I1002 09:51:55.966000 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-kube-api-access-f28c8" (OuterVolumeSpecName: "kube-api-access-f28c8") pod "b9b5a58f-c0f4-4e76-88ef-842a2f936ee4" (UID: "b9b5a58f-c0f4-4e76-88ef-842a2f936ee4"). InnerVolumeSpecName "kube-api-access-f28c8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:51:55 crc kubenswrapper[4722]: I1002 09:51:55.973438 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b9b5a58f-c0f4-4e76-88ef-842a2f936ee4" (UID: "b9b5a58f-c0f4-4e76-88ef-842a2f936ee4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 09:51:55 crc kubenswrapper[4722]: I1002 09:51:55.984522 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9b5a58f-c0f4-4e76-88ef-842a2f936ee4" (UID: "b9b5a58f-c0f4-4e76-88ef-842a2f936ee4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.061031 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.061081 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.061092 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f28c8\" (UniqueName: \"kubernetes.io/projected/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4-kube-api-access-f28c8\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.653545 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" event={"ID":"b9b5a58f-c0f4-4e76-88ef-842a2f936ee4","Type":"ContainerDied","Data":"d7d440f21a1684ccbd4a99dc9b61a268b2cf7333d81b3b8c5b5820c2e96e0c0f"} Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.653584 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7d440f21a1684ccbd4a99dc9b61a268b2cf7333d81b3b8c5b5820c2e96e0c0f" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.653604 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-nqdbc" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.886985 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx"] Oct 02 09:51:56 crc kubenswrapper[4722]: E1002 09:51:56.887247 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b5a58f-c0f4-4e76-88ef-842a2f936ee4" containerName="barbican-db-sync" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.887259 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b5a58f-c0f4-4e76-88ef-842a2f936ee4" containerName="barbican-db-sync" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.887416 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b5a58f-c0f4-4e76-88ef-842a2f936ee4" containerName="barbican-db-sync" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.888145 4722 util.go:30] "No sandbox for pod can be found. 
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.890548 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-ct4tg"
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.891364 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-worker-config-data"
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.891637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"combined-ca-bundle"
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.901575 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx"]
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.909133 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"]
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.910960 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.914776 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"]
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.973293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data-custom\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.973377 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2dmg\" (UniqueName: \"kubernetes.io/projected/8b026b07-bc77-4017-a807-4c2f9291f447-kube-api-access-t2dmg\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx"
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.973488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-combined-ca-bundle\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.973590 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"
Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.973617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd624d7-27f9-494c-8c61-b21b245f496f-logs\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"
pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.973647 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b026b07-bc77-4017-a807-4c2f9291f447-logs\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.973678 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jdtp\" (UniqueName: \"kubernetes.io/projected/7cd624d7-27f9-494c-8c61-b21b245f496f-kube-api-access-4jdtp\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.973852 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-combined-ca-bundle\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.973993 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data-custom\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.974205 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:56 crc kubenswrapper[4722]: I1002 09:51:56.999850 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4"] Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.001742 4722 util.go:30] "No sandbox for pod can be found. 
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.003721 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"cert-barbican-internal-svc"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.004166 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-api-config-data"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.004181 4722 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"cert-barbican-public-svc"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.014779 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4"]
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075086 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jdtp\" (UniqueName: \"kubernetes.io/projected/7cd624d7-27f9-494c-8c61-b21b245f496f-kube-api-access-4jdtp\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075137 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-combined-ca-bundle\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075172 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data-custom\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-internal-tls-certs\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075235 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data-custom\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4"
\"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvv2l\" (UniqueName: \"kubernetes.io/projected/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-kube-api-access-gvv2l\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-public-tls-certs\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075373 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-combined-ca-bundle\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075401 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2dmg\" (UniqueName: \"kubernetes.io/projected/8b026b07-bc77-4017-a807-4c2f9291f447-kube-api-access-t2dmg\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data-custom\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075449 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-combined-ca-bundle\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075496 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-logs\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075522 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " 
pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075548 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd624d7-27f9-494c-8c61-b21b245f496f-logs\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075568 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b026b07-bc77-4017-a807-4c2f9291f447-logs\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.075923 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd624d7-27f9-494c-8c61-b21b245f496f-logs\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.076008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b026b07-bc77-4017-a807-4c2f9291f447-logs\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.079760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-combined-ca-bundle\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.079796 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data-custom\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.083784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-combined-ca-bundle\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.085532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.087536 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: 
\"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.087718 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data-custom\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.092546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2dmg\" (UniqueName: \"kubernetes.io/projected/8b026b07-bc77-4017-a807-4c2f9291f447-kube-api-access-t2dmg\") pod \"barbican-worker-69bf8c6599-fz5nx\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.097441 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jdtp\" (UniqueName: \"kubernetes.io/projected/7cd624d7-27f9-494c-8c61-b21b245f496f-kube-api-access-4jdtp\") pod \"barbican-keystone-listener-97f9cf4-24pvp\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.177030 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-logs\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.177547 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-internal-tls-certs\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.177587 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.177607 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvv2l\" (UniqueName: \"kubernetes.io/projected/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-kube-api-access-gvv2l\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.177628 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-public-tls-certs\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.177645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-combined-ca-bundle\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.177672 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data-custom\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.177464 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-logs\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.181564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-public-tls-certs\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.182938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.183094 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-internal-tls-certs\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.183250 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data-custom\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.189725 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-combined-ca-bundle\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.198272 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvv2l\" (UniqueName: \"kubernetes.io/projected/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-kube-api-access-gvv2l\") pod \"barbican-api-5bdbdb6448-47vp4\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.204151 4722 util.go:30] "No sandbox for pod can be found. 
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.225699 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.315875 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.557562 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"]
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.614654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4"]
Oct 02 09:51:57 crc kubenswrapper[4722]: W1002 09:51:57.620168 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ca9bdee_14e5_440e_9e3e_a976f54c6d75.slice/crio-5bf363522cb1f5ee5a1dff22314711e98ce0738cbd90a33164e6756bf7f6486c WatchSource:0}: Error finding container 5bf363522cb1f5ee5a1dff22314711e98ce0738cbd90a33164e6756bf7f6486c: Status 404 returned error can't find the container with id 5bf363522cb1f5ee5a1dff22314711e98ce0738cbd90a33164e6756bf7f6486c
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.666375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" event={"ID":"6ca9bdee-14e5-440e-9e3e-a976f54c6d75","Type":"ContainerStarted","Data":"5bf363522cb1f5ee5a1dff22314711e98ce0738cbd90a33164e6756bf7f6486c"}
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.672118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" event={"ID":"7cd624d7-27f9-494c-8c61-b21b245f496f","Type":"ContainerStarted","Data":"09c0fc3b2eb35da1301df20b45cc87678c307cadbfcd91da1b86f8f315cb31d5"}
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.678899 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx"]
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.774969 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-nqdbc"]
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.784743 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-nqdbc"]
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.807447 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbicanc513-account-delete-ndb4h"]
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.809502 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.816170 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbicanc513-account-delete-ndb4h"]
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.875123 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"]
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.889199 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx"]
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.891472 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq2x8\" (UniqueName: \"kubernetes.io/projected/9efa47a4-66d0-4f08-a57f-3bc68b601e0b-kube-api-access-pq2x8\") pod \"barbicanc513-account-delete-ndb4h\" (UID: \"9efa47a4-66d0-4f08-a57f-3bc68b601e0b\") " pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h"
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.933511 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4"]
Oct 02 09:51:57 crc kubenswrapper[4722]: I1002 09:51:57.993000 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq2x8\" (UniqueName: \"kubernetes.io/projected/9efa47a4-66d0-4f08-a57f-3bc68b601e0b-kube-api-access-pq2x8\") pod \"barbicanc513-account-delete-ndb4h\" (UID: \"9efa47a4-66d0-4f08-a57f-3bc68b601e0b\") " pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h"
Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.018049 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq2x8\" (UniqueName: \"kubernetes.io/projected/9efa47a4-66d0-4f08-a57f-3bc68b601e0b-kube-api-access-pq2x8\") pod \"barbicanc513-account-delete-ndb4h\" (UID: \"9efa47a4-66d0-4f08-a57f-3bc68b601e0b\") " pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h"
Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.197463 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h"
Need to start a new one" pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h" Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.604708 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbicanc513-account-delete-ndb4h"] Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.678795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h" event={"ID":"9efa47a4-66d0-4f08-a57f-3bc68b601e0b","Type":"ContainerStarted","Data":"68f95fa6b9c2b81a6bc27ccfdf7bee821deaf2df6c315d3c53d7bfa65e150edf"} Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.680835 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" event={"ID":"6ca9bdee-14e5-440e-9e3e-a976f54c6d75","Type":"ContainerStarted","Data":"ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e"} Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.680866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" event={"ID":"6ca9bdee-14e5-440e-9e3e-a976f54c6d75","Type":"ContainerStarted","Data":"8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8"} Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.681001 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" podUID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" containerName="barbican-api-log" containerID="cri-o://8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8" gracePeriod=30 Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.681023 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" podUID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" containerName="barbican-api" containerID="cri-o://ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e" gracePeriod=30 Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.681180 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.683810 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" event={"ID":"7cd624d7-27f9-494c-8c61-b21b245f496f","Type":"ContainerStarted","Data":"68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8"} Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.683858 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" event={"ID":"7cd624d7-27f9-494c-8c61-b21b245f496f","Type":"ContainerStarted","Data":"3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf"} Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.684009 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" podUID="7cd624d7-27f9-494c-8c61-b21b245f496f" containerName="barbican-keystone-listener-log" containerID="cri-o://3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf" gracePeriod=30 Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.684084 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" podUID="7cd624d7-27f9-494c-8c61-b21b245f496f" 
containerName="barbican-keystone-listener" containerID="cri-o://68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8" gracePeriod=30 Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.693278 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" event={"ID":"8b026b07-bc77-4017-a807-4c2f9291f447","Type":"ContainerStarted","Data":"4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8"} Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.693351 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" event={"ID":"8b026b07-bc77-4017-a807-4c2f9291f447","Type":"ContainerStarted","Data":"74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4"} Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.693366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" event={"ID":"8b026b07-bc77-4017-a807-4c2f9291f447","Type":"ContainerStarted","Data":"e54e260dbb7046f394c8e64cc636ce9ee1727328f56f94b931242cddfddec3b7"} Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.693524 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" podUID="8b026b07-bc77-4017-a807-4c2f9291f447" containerName="barbican-worker" containerID="cri-o://4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8" gracePeriod=30 Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.693529 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" podUID="8b026b07-bc77-4017-a807-4c2f9291f447" containerName="barbican-worker-log" containerID="cri-o://74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4" gracePeriod=30 Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.725218 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" podStartSLOduration=2.7251971790000002 podStartE2EDuration="2.725197179s" podCreationTimestamp="2025-10-02 09:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:51:58.717511391 +0000 UTC m=+1174.754264381" watchObservedRunningTime="2025-10-02 09:51:58.725197179 +0000 UTC m=+1174.761950159" Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.726100 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b5a58f-c0f4-4e76-88ef-842a2f936ee4" path="/var/lib/kubelet/pods/b9b5a58f-c0f4-4e76-88ef-842a2f936ee4/volumes" Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.740071 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" podStartSLOduration=2.74005073 podStartE2EDuration="2.74005073s" podCreationTimestamp="2025-10-02 09:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:51:58.737059043 +0000 UTC m=+1174.773812043" watchObservedRunningTime="2025-10-02 09:51:58.74005073 +0000 UTC m=+1174.776803730" Oct 02 09:51:58 crc kubenswrapper[4722]: I1002 09:51:58.757092 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" podStartSLOduration=2.757073847 
podStartE2EDuration="2.757073847s" podCreationTimestamp="2025-10-02 09:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:51:58.753836744 +0000 UTC m=+1174.790589724" watchObservedRunningTime="2025-10-02 09:51:58.757073847 +0000 UTC m=+1174.793826817" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.188100 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.309879 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-internal-tls-certs\") pod \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.310604 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvv2l\" (UniqueName: \"kubernetes.io/projected/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-kube-api-access-gvv2l\") pod \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.310727 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-combined-ca-bundle\") pod \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.310852 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data-custom\") pod \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.310977 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data\") pod \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.311306 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-logs\") pod \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.311533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-public-tls-certs\") pod \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\" (UID: \"6ca9bdee-14e5-440e-9e3e-a976f54c6d75\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.311671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-logs" (OuterVolumeSpecName: "logs") pod "6ca9bdee-14e5-440e-9e3e-a976f54c6d75" (UID: "6ca9bdee-14e5-440e-9e3e-a976f54c6d75"). InnerVolumeSpecName "logs". 
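In the "Observed pod startup duration" entries above (pod_startup_latency_tracker.go:104), podStartSLOduration equals podStartE2EDuration because both pull timestamps are the zero time (no image was pulled, so no pull window is excluded). For barbican-api-5bdbdb6448-47vp4 the reported value is simply the watch-observed running time minus the creation timestamp: 09:51:58.725197179 - 09:51:56 = 2.725197179s. The arithmetic, reproduced:

```go
package main

import (
	"fmt"
	"time"
)

// Reproduces the podStartSLOduration arithmetic for
// barbican-api-5bdbdb6448-47vp4 from the timestamps in the log above.
func main() {
	created, _ := time.Parse(time.RFC3339Nano, "2025-10-02T09:51:56Z")
	observed, _ := time.Parse(time.RFC3339Nano, "2025-10-02T09:51:58.725197179Z")
	fmt.Println(observed.Sub(created).Seconds()) // 2.725197179, as logged
}
```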
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.312035 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-logs\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.316577 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6ca9bdee-14e5-440e-9e3e-a976f54c6d75" (UID: "6ca9bdee-14e5-440e-9e3e-a976f54c6d75"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.330482 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-kube-api-access-gvv2l" (OuterVolumeSpecName: "kube-api-access-gvv2l") pod "6ca9bdee-14e5-440e-9e3e-a976f54c6d75" (UID: "6ca9bdee-14e5-440e-9e3e-a976f54c6d75"). InnerVolumeSpecName "kube-api-access-gvv2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.353472 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ca9bdee-14e5-440e-9e3e-a976f54c6d75" (UID: "6ca9bdee-14e5-440e-9e3e-a976f54c6d75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.354819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6ca9bdee-14e5-440e-9e3e-a976f54c6d75" (UID: "6ca9bdee-14e5-440e-9e3e-a976f54c6d75"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.355410 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6ca9bdee-14e5-440e-9e3e-a976f54c6d75" (UID: "6ca9bdee-14e5-440e-9e3e-a976f54c6d75"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.365304 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data" (OuterVolumeSpecName: "config-data") pod "6ca9bdee-14e5-440e-9e3e-a976f54c6d75" (UID: "6ca9bdee-14e5-440e-9e3e-a976f54c6d75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.413984 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.414030 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvv2l\" (UniqueName: \"kubernetes.io/projected/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-kube-api-access-gvv2l\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.414043 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.414066 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.414076 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.414085 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ca9bdee-14e5-440e-9e3e-a976f54c6d75-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.703074 4722 generic.go:334] "Generic (PLEG): container finished" podID="9efa47a4-66d0-4f08-a57f-3bc68b601e0b" containerID="23a74c9ce6b0ff4df8f7d06b59f6b2240d95d5d712ca55a741bd3392a648e7e4" exitCode=0 Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.703176 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h" event={"ID":"9efa47a4-66d0-4f08-a57f-3bc68b601e0b","Type":"ContainerDied","Data":"23a74c9ce6b0ff4df8f7d06b59f6b2240d95d5d712ca55a741bd3392a648e7e4"} Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.716834 4722 generic.go:334] "Generic (PLEG): container finished" podID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" containerID="ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e" exitCode=0 Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.716874 4722 generic.go:334] "Generic (PLEG): container finished" podID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" containerID="8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8" exitCode=143 Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.716905 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.716926 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" event={"ID":"6ca9bdee-14e5-440e-9e3e-a976f54c6d75","Type":"ContainerDied","Data":"ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e"} Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.716952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" event={"ID":"6ca9bdee-14e5-440e-9e3e-a976f54c6d75","Type":"ContainerDied","Data":"8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8"} Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.716961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4" event={"ID":"6ca9bdee-14e5-440e-9e3e-a976f54c6d75","Type":"ContainerDied","Data":"5bf363522cb1f5ee5a1dff22314711e98ce0738cbd90a33164e6756bf7f6486c"} Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.716976 4722 scope.go:117] "RemoveContainer" containerID="ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.722574 4722 generic.go:334] "Generic (PLEG): container finished" podID="837089fe-a05d-42fb-913c-41a52ed8e2a5" containerID="6bc3f10e6ffbcfeb6d2aefe03a065cf85c1b0235f4091caeb2b3a328fec22120" exitCode=137 Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.722637 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" event={"ID":"837089fe-a05d-42fb-913c-41a52ed8e2a5","Type":"ContainerDied","Data":"6bc3f10e6ffbcfeb6d2aefe03a065cf85c1b0235f4091caeb2b3a328fec22120"} Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.722675 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" event={"ID":"837089fe-a05d-42fb-913c-41a52ed8e2a5","Type":"ContainerDied","Data":"3925a05463ae35c7c96f98144aba31183e7074c93d6909da2824e864a13e4198"} Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.722700 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3925a05463ae35c7c96f98144aba31183e7074c93d6909da2824e864a13e4198" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.724913 4722 generic.go:334] "Generic (PLEG): container finished" podID="7cd624d7-27f9-494c-8c61-b21b245f496f" containerID="3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf" exitCode=143 Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.724957 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" event={"ID":"7cd624d7-27f9-494c-8c61-b21b245f496f","Type":"ContainerDied","Data":"3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf"} Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.744290 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b026b07-bc77-4017-a807-4c2f9291f447" containerID="74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4" exitCode=143 Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.744346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" 
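The "Generic (PLEG): container finished" entries (generic.go:334) carry the usual Unix encoding: exitCode=0 is a clean exit (the account-delete job), codes above 128 are 128 plus the fatal signal number, so 143 is SIGTERM (the -log sidecars) and 137 is SIGKILL (barbican-keystone-listener-7db7564864-mqwgz, killed after its grace period); the plain exitCode=1 seen shortly afterwards for the worker and listener main containers is an application-level error. The decode rule, worked through:

```go
package main

import "fmt"

// describe decodes the exit codes reported by PLEG in this log.
func describe(code int) string {
	switch {
	case code == 0:
		return "exited cleanly"
	case code > 128:
		return fmt.Sprintf("killed by signal %d (128+%d)", code-128, code-128)
	default:
		return fmt.Sprintf("application error %d", code)
	}
}

func main() {
	for _, c := range []int{0, 143, 137, 1} {
		fmt.Printf("exitCode=%d: %s\n", c, describe(c))
	}
}
```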
event={"ID":"8b026b07-bc77-4017-a807-4c2f9291f447","Type":"ContainerDied","Data":"74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4"} Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.763971 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.791710 4722 scope.go:117] "RemoveContainer" containerID="8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.803696 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4"] Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.811000 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-5bdbdb6448-47vp4"] Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.821574 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837089fe-a05d-42fb-913c-41a52ed8e2a5-logs\") pod \"837089fe-a05d-42fb-913c-41a52ed8e2a5\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.821666 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data\") pod \"837089fe-a05d-42fb-913c-41a52ed8e2a5\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.821713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data-custom\") pod \"837089fe-a05d-42fb-913c-41a52ed8e2a5\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.821764 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nh7m\" (UniqueName: \"kubernetes.io/projected/837089fe-a05d-42fb-913c-41a52ed8e2a5-kube-api-access-7nh7m\") pod \"837089fe-a05d-42fb-913c-41a52ed8e2a5\" (UID: \"837089fe-a05d-42fb-913c-41a52ed8e2a5\") " Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.821963 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837089fe-a05d-42fb-913c-41a52ed8e2a5-logs" (OuterVolumeSpecName: "logs") pod "837089fe-a05d-42fb-913c-41a52ed8e2a5" (UID: "837089fe-a05d-42fb-913c-41a52ed8e2a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.822223 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837089fe-a05d-42fb-913c-41a52ed8e2a5-logs\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.826691 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837089fe-a05d-42fb-913c-41a52ed8e2a5-kube-api-access-7nh7m" (OuterVolumeSpecName: "kube-api-access-7nh7m") pod "837089fe-a05d-42fb-913c-41a52ed8e2a5" (UID: "837089fe-a05d-42fb-913c-41a52ed8e2a5"). InnerVolumeSpecName "kube-api-access-7nh7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.828837 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "837089fe-a05d-42fb-913c-41a52ed8e2a5" (UID: "837089fe-a05d-42fb-913c-41a52ed8e2a5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.840277 4722 scope.go:117] "RemoveContainer" containerID="ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e" Oct 02 09:51:59 crc kubenswrapper[4722]: E1002 09:51:59.842627 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e\": container with ID starting with ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e not found: ID does not exist" containerID="ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.842679 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e"} err="failed to get container status \"ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e\": rpc error: code = NotFound desc = could not find container \"ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e\": container with ID starting with ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e not found: ID does not exist" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.842713 4722 scope.go:117] "RemoveContainer" containerID="8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8" Oct 02 09:51:59 crc kubenswrapper[4722]: E1002 09:51:59.845740 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8\": container with ID starting with 8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8 not found: ID does not exist" containerID="8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.845790 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8"} err="failed to get container status \"8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8\": rpc error: code = NotFound desc = could not find container \"8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8\": container with ID starting with 8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8 not found: ID does not exist" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.845820 4722 scope.go:117] "RemoveContainer" containerID="ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.846407 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e"} err="failed to get container status \"ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e\": rpc error: code = NotFound desc = could not find container 
\"ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e\": container with ID starting with ef7a8d14e52e2d763ba650cdb104d6cd56579439654af19c0284a9e245d4087e not found: ID does not exist" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.846452 4722 scope.go:117] "RemoveContainer" containerID="8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.847034 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8"} err="failed to get container status \"8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8\": rpc error: code = NotFound desc = could not find container \"8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8\": container with ID starting with 8b691b3309424f71d68f6bd989784b4a560d4a733bd88c92bdbacb7f5bf5cfa8 not found: ID does not exist" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.883440 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data" (OuterVolumeSpecName: "config-data") pod "837089fe-a05d-42fb-913c-41a52ed8e2a5" (UID: "837089fe-a05d-42fb-913c-41a52ed8e2a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.927318 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.927373 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837089fe-a05d-42fb-913c-41a52ed8e2a5-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 09:51:59 crc kubenswrapper[4722]: I1002 09:51:59.927390 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nh7m\" (UniqueName: \"kubernetes.io/projected/837089fe-a05d-42fb-913c-41a52ed8e2a5-kube-api-access-7nh7m\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.036127 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.052248 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.130261 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jdtp\" (UniqueName: \"kubernetes.io/projected/7cd624d7-27f9-494c-8c61-b21b245f496f-kube-api-access-4jdtp\") pod \"7cd624d7-27f9-494c-8c61-b21b245f496f\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.130757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data\") pod \"7cd624d7-27f9-494c-8c61-b21b245f496f\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.130801 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data-custom\") pod \"8b026b07-bc77-4017-a807-4c2f9291f447\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.130905 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b026b07-bc77-4017-a807-4c2f9291f447-logs\") pod \"8b026b07-bc77-4017-a807-4c2f9291f447\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.130945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-combined-ca-bundle\") pod \"8b026b07-bc77-4017-a807-4c2f9291f447\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.130972 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd624d7-27f9-494c-8c61-b21b245f496f-logs\") pod \"7cd624d7-27f9-494c-8c61-b21b245f496f\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.131005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2dmg\" (UniqueName: \"kubernetes.io/projected/8b026b07-bc77-4017-a807-4c2f9291f447-kube-api-access-t2dmg\") pod \"8b026b07-bc77-4017-a807-4c2f9291f447\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.131034 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data\") pod \"8b026b07-bc77-4017-a807-4c2f9291f447\" (UID: \"8b026b07-bc77-4017-a807-4c2f9291f447\") " Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.131074 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data-custom\") pod \"7cd624d7-27f9-494c-8c61-b21b245f496f\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.131109 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-combined-ca-bundle\") pod 
\"7cd624d7-27f9-494c-8c61-b21b245f496f\" (UID: \"7cd624d7-27f9-494c-8c61-b21b245f496f\") " Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.132844 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cd624d7-27f9-494c-8c61-b21b245f496f-logs" (OuterVolumeSpecName: "logs") pod "7cd624d7-27f9-494c-8c61-b21b245f496f" (UID: "7cd624d7-27f9-494c-8c61-b21b245f496f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.132866 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b026b07-bc77-4017-a807-4c2f9291f447-logs" (OuterVolumeSpecName: "logs") pod "8b026b07-bc77-4017-a807-4c2f9291f447" (UID: "8b026b07-bc77-4017-a807-4c2f9291f447"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.133895 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd624d7-27f9-494c-8c61-b21b245f496f-kube-api-access-4jdtp" (OuterVolumeSpecName: "kube-api-access-4jdtp") pod "7cd624d7-27f9-494c-8c61-b21b245f496f" (UID: "7cd624d7-27f9-494c-8c61-b21b245f496f"). InnerVolumeSpecName "kube-api-access-4jdtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.134610 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8b026b07-bc77-4017-a807-4c2f9291f447" (UID: "8b026b07-bc77-4017-a807-4c2f9291f447"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.136077 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7cd624d7-27f9-494c-8c61-b21b245f496f" (UID: "7cd624d7-27f9-494c-8c61-b21b245f496f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.136108 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b026b07-bc77-4017-a807-4c2f9291f447-kube-api-access-t2dmg" (OuterVolumeSpecName: "kube-api-access-t2dmg") pod "8b026b07-bc77-4017-a807-4c2f9291f447" (UID: "8b026b07-bc77-4017-a807-4c2f9291f447"). InnerVolumeSpecName "kube-api-access-t2dmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.152618 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cd624d7-27f9-494c-8c61-b21b245f496f" (UID: "7cd624d7-27f9-494c-8c61-b21b245f496f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.160711 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b026b07-bc77-4017-a807-4c2f9291f447" (UID: "8b026b07-bc77-4017-a807-4c2f9291f447"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.189479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data" (OuterVolumeSpecName: "config-data") pod "7cd624d7-27f9-494c-8c61-b21b245f496f" (UID: "7cd624d7-27f9-494c-8c61-b21b245f496f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.196511 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data" (OuterVolumeSpecName: "config-data") pod "8b026b07-bc77-4017-a807-4c2f9291f447" (UID: "8b026b07-bc77-4017-a807-4c2f9291f447"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.232706 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.232744 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd624d7-27f9-494c-8c61-b21b245f496f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.232757 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2dmg\" (UniqueName: \"kubernetes.io/projected/8b026b07-bc77-4017-a807-4c2f9291f447-kube-api-access-t2dmg\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.232768 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.232777 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.232789 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.232797 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jdtp\" (UniqueName: \"kubernetes.io/projected/7cd624d7-27f9-494c-8c61-b21b245f496f-kube-api-access-4jdtp\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.232806 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd624d7-27f9-494c-8c61-b21b245f496f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.232813 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b026b07-bc77-4017-a807-4c2f9291f447-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.232820 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8b026b07-bc77-4017-a807-4c2f9291f447-logs\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.718220 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" path="/var/lib/kubelet/pods/6ca9bdee-14e5-440e-9e3e-a976f54c6d75/volumes" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.753106 4722 generic.go:334] "Generic (PLEG): container finished" podID="7cd624d7-27f9-494c-8c61-b21b245f496f" containerID="68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8" exitCode=1 Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.753180 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.753278 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" event={"ID":"7cd624d7-27f9-494c-8c61-b21b245f496f","Type":"ContainerDied","Data":"68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8"} Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.753351 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp" event={"ID":"7cd624d7-27f9-494c-8c61-b21b245f496f","Type":"ContainerDied","Data":"09c0fc3b2eb35da1301df20b45cc87678c307cadbfcd91da1b86f8f315cb31d5"} Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.753376 4722 scope.go:117] "RemoveContainer" containerID="68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.755852 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b026b07-bc77-4017-a807-4c2f9291f447" containerID="4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8" exitCode=1 Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.755890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" event={"ID":"8b026b07-bc77-4017-a807-4c2f9291f447","Type":"ContainerDied","Data":"4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8"} Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.755922 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" event={"ID":"8b026b07-bc77-4017-a807-4c2f9291f447","Type":"ContainerDied","Data":"e54e260dbb7046f394c8e64cc636ce9ee1727328f56f94b931242cddfddec3b7"} Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.755876 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.757482 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.783710 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"] Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.788037 4722 scope.go:117] "RemoveContainer" containerID="3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.789407 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-97f9cf4-24pvp"] Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.800353 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx"] Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.810199 4722 scope.go:117] "RemoveContainer" containerID="68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8" Oct 02 09:52:00 crc kubenswrapper[4722]: E1002 09:52:00.812340 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8\": container with ID starting with 68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8 not found: ID does not exist" containerID="68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.812398 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8"} err="failed to get container status \"68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8\": rpc error: code = NotFound desc = could not find container \"68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8\": container with ID starting with 68437dea428f2d3603c7b503e46cf7e42bb80b55d07a41e9be623f94329cf6a8 not found: ID does not exist" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.812438 4722 scope.go:117] "RemoveContainer" containerID="3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf" Oct 02 09:52:00 crc kubenswrapper[4722]: E1002 09:52:00.814510 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf\": container with ID starting with 3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf not found: ID does not exist" containerID="3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.814549 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf"} err="failed to get container status \"3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf\": rpc error: code = NotFound desc = could not find container \"3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf\": container with ID starting with 3a4644b019e699b79b93bbe0f9a39f51515b7a6a996a7734909e27525b001baf not found: ID does not exist" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.814573 4722 scope.go:117] "RemoveContainer" containerID="4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.823236 4722 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-69bf8c6599-fz5nx"] Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.829260 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz"] Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.835403 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7db7564864-mqwgz"] Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.845513 4722 scope.go:117] "RemoveContainer" containerID="74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.876683 4722 scope.go:117] "RemoveContainer" containerID="4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8" Oct 02 09:52:00 crc kubenswrapper[4722]: E1002 09:52:00.877703 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8\": container with ID starting with 4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8 not found: ID does not exist" containerID="4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.877792 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8"} err="failed to get container status \"4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8\": rpc error: code = NotFound desc = could not find container \"4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8\": container with ID starting with 4a5d603d3231c58a45f795e5d4c5258e7462ea5deef2fed5723752b00065f2f8 not found: ID does not exist" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.877873 4722 scope.go:117] "RemoveContainer" containerID="74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4" Oct 02 09:52:00 crc kubenswrapper[4722]: E1002 09:52:00.878789 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4\": container with ID starting with 74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4 not found: ID does not exist" containerID="74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4" Oct 02 09:52:00 crc kubenswrapper[4722]: I1002 09:52:00.878844 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4"} err="failed to get container status \"74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4\": rpc error: code = NotFound desc = could not find container \"74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4\": container with ID starting with 74d06a6dcd7c6308970c26eb331d1699511930a2a57bc527e0ebf3b3659de1e4 not found: ID does not exist" Oct 02 09:52:01 crc kubenswrapper[4722]: I1002 09:52:01.027069 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h" Oct 02 09:52:01 crc kubenswrapper[4722]: I1002 09:52:01.147468 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq2x8\" (UniqueName: \"kubernetes.io/projected/9efa47a4-66d0-4f08-a57f-3bc68b601e0b-kube-api-access-pq2x8\") pod \"9efa47a4-66d0-4f08-a57f-3bc68b601e0b\" (UID: \"9efa47a4-66d0-4f08-a57f-3bc68b601e0b\") " Oct 02 09:52:01 crc kubenswrapper[4722]: I1002 09:52:01.159455 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9efa47a4-66d0-4f08-a57f-3bc68b601e0b-kube-api-access-pq2x8" (OuterVolumeSpecName: "kube-api-access-pq2x8") pod "9efa47a4-66d0-4f08-a57f-3bc68b601e0b" (UID: "9efa47a4-66d0-4f08-a57f-3bc68b601e0b"). InnerVolumeSpecName "kube-api-access-pq2x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:01 crc kubenswrapper[4722]: I1002 09:52:01.250208 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq2x8\" (UniqueName: \"kubernetes.io/projected/9efa47a4-66d0-4f08-a57f-3bc68b601e0b-kube-api-access-pq2x8\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:01 crc kubenswrapper[4722]: I1002 09:52:01.766874 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h" event={"ID":"9efa47a4-66d0-4f08-a57f-3bc68b601e0b","Type":"ContainerDied","Data":"68f95fa6b9c2b81a6bc27ccfdf7bee821deaf2df6c315d3c53d7bfa65e150edf"} Oct 02 09:52:01 crc kubenswrapper[4722]: I1002 09:52:01.766948 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68f95fa6b9c2b81a6bc27ccfdf7bee821deaf2df6c315d3c53d7bfa65e150edf" Oct 02 09:52:01 crc kubenswrapper[4722]: I1002 09:52:01.766924 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbicanc513-account-delete-ndb4h" Oct 02 09:52:02 crc kubenswrapper[4722]: I1002 09:52:02.718638 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd624d7-27f9-494c-8c61-b21b245f496f" path="/var/lib/kubelet/pods/7cd624d7-27f9-494c-8c61-b21b245f496f/volumes" Oct 02 09:52:02 crc kubenswrapper[4722]: I1002 09:52:02.719604 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837089fe-a05d-42fb-913c-41a52ed8e2a5" path="/var/lib/kubelet/pods/837089fe-a05d-42fb-913c-41a52ed8e2a5/volumes" Oct 02 09:52:02 crc kubenswrapper[4722]: I1002 09:52:02.720545 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b026b07-bc77-4017-a807-4c2f9291f447" path="/var/lib/kubelet/pods/8b026b07-bc77-4017-a807-4c2f9291f447/volumes" Oct 02 09:52:02 crc kubenswrapper[4722]: I1002 09:52:02.826859 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-zzdpc"] Oct 02 09:52:02 crc kubenswrapper[4722]: I1002 09:52:02.835452 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-zzdpc"] Oct 02 09:52:02 crc kubenswrapper[4722]: I1002 09:52:02.840144 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbicanc513-account-delete-ndb4h"] Oct 02 09:52:02 crc kubenswrapper[4722]: I1002 09:52:02.844785 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbicanc513-account-delete-ndb4h"] Oct 02 09:52:02 crc kubenswrapper[4722]: I1002 09:52:02.849245 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-c513-account-create-fpzlj"] Oct 02 09:52:02 crc kubenswrapper[4722]: I1002 09:52:02.853386 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-c513-account-create-fpzlj"] Oct 02 09:52:04 crc kubenswrapper[4722]: I1002 09:52:04.718279 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0609424e-afb3-4d80-bf7a-bf8a09a0a2e8" path="/var/lib/kubelet/pods/0609424e-afb3-4d80-bf7a-bf8a09a0a2e8/volumes" Oct 02 09:52:04 crc kubenswrapper[4722]: I1002 09:52:04.719236 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9efa47a4-66d0-4f08-a57f-3bc68b601e0b" path="/var/lib/kubelet/pods/9efa47a4-66d0-4f08-a57f-3bc68b601e0b/volumes" Oct 02 09:52:04 crc kubenswrapper[4722]: I1002 09:52:04.719709 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2358339-8968-42f4-9f6d-05933fb3e2f7" path="/var/lib/kubelet/pods/a2358339-8968-42f4-9f6d-05933fb3e2f7/volumes" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.517309 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-dkjtj"] Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.529623 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-dkjtj"] Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.534838 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-chdcv"] Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.540378 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-chdcv"] Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.544643 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-54b885497-t8c9k"] Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 
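The periodic "Cleaned up orphaned pod volumes dir" entries (kubelet_volumes.go:163, at 09:52:00, 09:52:02, and 09:52:04) are housekeeping: once a deleted pod's volumes have all been torn down, the kubelet removes the now-empty /var/lib/kubelet/pods/<UID>/volumes directory. A rough sketch under stated assumptions; the isActive callback and the rely-on-empty-directory check are illustrative choices, not the kubelet's exact logic:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

const kubeletPodsDir = "/var/lib/kubelet/pods" // pods root, as seen in the log paths

// cleanupOrphanedPodDirs removes the volumes dir for pod UIDs the kubelet
// no longer tracks. os.Remove only succeeds on an empty directory, so a
// pod with volumes still mounted is left alone.
func cleanupOrphanedPodDirs(podUIDs []string, isActive func(uid string) bool) {
	for _, uid := range podUIDs {
		if isActive(uid) {
			continue
		}
		dir := filepath.Join(kubeletPodsDir, uid, "volumes")
		if err := os.Remove(dir); err == nil {
			fmt.Printf("Cleaned up orphaned pod volumes dir path=%q\n", dir)
		}
	}
}

func main() {
	cleanupOrphanedPodDirs([]string{"9efa47a4-66d0-4f08-a57f-3bc68b601e0b"},
		func(string) bool { return false })
}
```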
09:52:07.544872 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" podUID="8198b0a7-cf5a-41f0-a528-0f3833ff97a6" containerName="keystone-api" containerID="cri-o://e329ca8245e6f8ad5d6e82c79296035a0ba51fcaabed81266e371466d8d27b9f" gracePeriod=30 Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.578845 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone9777-account-delete-l48jd"] Oct 02 09:52:07 crc kubenswrapper[4722]: E1002 09:52:07.579335 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b026b07-bc77-4017-a807-4c2f9291f447" containerName="barbican-worker-log" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579355 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b026b07-bc77-4017-a807-4c2f9291f447" containerName="barbican-worker-log" Oct 02 09:52:07 crc kubenswrapper[4722]: E1002 09:52:07.579371 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b026b07-bc77-4017-a807-4c2f9291f447" containerName="barbican-worker" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579379 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b026b07-bc77-4017-a807-4c2f9291f447" containerName="barbican-worker" Oct 02 09:52:07 crc kubenswrapper[4722]: E1002 09:52:07.579391 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd624d7-27f9-494c-8c61-b21b245f496f" containerName="barbican-keystone-listener-log" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579399 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd624d7-27f9-494c-8c61-b21b245f496f" containerName="barbican-keystone-listener-log" Oct 02 09:52:07 crc kubenswrapper[4722]: E1002 09:52:07.579423 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837089fe-a05d-42fb-913c-41a52ed8e2a5" containerName="barbican-keystone-listener-log" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579432 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="837089fe-a05d-42fb-913c-41a52ed8e2a5" containerName="barbican-keystone-listener-log" Oct 02 09:52:07 crc kubenswrapper[4722]: E1002 09:52:07.579445 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" containerName="barbican-api-log" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579454 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" containerName="barbican-api-log" Oct 02 09:52:07 crc kubenswrapper[4722]: E1002 09:52:07.579469 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837089fe-a05d-42fb-913c-41a52ed8e2a5" containerName="barbican-keystone-listener" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579476 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="837089fe-a05d-42fb-913c-41a52ed8e2a5" containerName="barbican-keystone-listener" Oct 02 09:52:07 crc kubenswrapper[4722]: E1002 09:52:07.579485 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd624d7-27f9-494c-8c61-b21b245f496f" containerName="barbican-keystone-listener" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579492 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd624d7-27f9-494c-8c61-b21b245f496f" containerName="barbican-keystone-listener" Oct 02 09:52:07 crc kubenswrapper[4722]: E1002 09:52:07.579506 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" containerName="barbican-api" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579514 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" containerName="barbican-api" Oct 02 09:52:07 crc kubenswrapper[4722]: E1002 09:52:07.579528 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9efa47a4-66d0-4f08-a57f-3bc68b601e0b" containerName="mariadb-account-delete" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579536 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efa47a4-66d0-4f08-a57f-3bc68b601e0b" containerName="mariadb-account-delete" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579668 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="837089fe-a05d-42fb-913c-41a52ed8e2a5" containerName="barbican-keystone-listener" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579679 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b026b07-bc77-4017-a807-4c2f9291f447" containerName="barbican-worker-log" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579685 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd624d7-27f9-494c-8c61-b21b245f496f" containerName="barbican-keystone-listener" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579694 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b026b07-bc77-4017-a807-4c2f9291f447" containerName="barbican-worker" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579703 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" containerName="barbican-api" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579710 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd624d7-27f9-494c-8c61-b21b245f496f" containerName="barbican-keystone-listener-log" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579716 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="837089fe-a05d-42fb-913c-41a52ed8e2a5" containerName="barbican-keystone-listener-log" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579725 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ca9bdee-14e5-440e-9e3e-a976f54c6d75" containerName="barbican-api-log" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.579732 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9efa47a4-66d0-4f08-a57f-3bc68b601e0b" containerName="mariadb-account-delete" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.580241 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone9777-account-delete-l48jd" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.589936 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone9777-account-delete-l48jd"] Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.626526 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-llpsl"] Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.629807 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone9777-account-delete-l48jd"] Oct 02 09:52:07 crc kubenswrapper[4722]: E1002 09:52:07.630263 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-74qcj], unattached volumes=[], failed to process volumes=[]: context canceled" pod="barbican-kuttl-tests/keystone9777-account-delete-l48jd" podUID="0c078473-c6c2-4268-b51f-de10a322f6bb" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.633897 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-9777-account-create-zbds6"] Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.638059 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-llpsl"] Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.641576 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-9777-account-create-zbds6"] Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.647593 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74qcj\" (UniqueName: \"kubernetes.io/projected/0c078473-c6c2-4268-b51f-de10a322f6bb-kube-api-access-74qcj\") pod \"keystone9777-account-delete-l48jd\" (UID: \"0c078473-c6c2-4268-b51f-de10a322f6bb\") " pod="barbican-kuttl-tests/keystone9777-account-delete-l48jd" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.749516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74qcj\" (UniqueName: \"kubernetes.io/projected/0c078473-c6c2-4268-b51f-de10a322f6bb-kube-api-access-74qcj\") pod \"keystone9777-account-delete-l48jd\" (UID: \"0c078473-c6c2-4268-b51f-de10a322f6bb\") " pod="barbican-kuttl-tests/keystone9777-account-delete-l48jd" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.768381 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74qcj\" (UniqueName: \"kubernetes.io/projected/0c078473-c6c2-4268-b51f-de10a322f6bb-kube-api-access-74qcj\") pod \"keystone9777-account-delete-l48jd\" (UID: \"0c078473-c6c2-4268-b51f-de10a322f6bb\") " pod="barbican-kuttl-tests/keystone9777-account-delete-l48jd" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.809762 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone9777-account-delete-l48jd" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.820740 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone9777-account-delete-l48jd" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.851027 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74qcj\" (UniqueName: \"kubernetes.io/projected/0c078473-c6c2-4268-b51f-de10a322f6bb-kube-api-access-74qcj\") pod \"0c078473-c6c2-4268-b51f-de10a322f6bb\" (UID: \"0c078473-c6c2-4268-b51f-de10a322f6bb\") " Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.855516 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c078473-c6c2-4268-b51f-de10a322f6bb-kube-api-access-74qcj" (OuterVolumeSpecName: "kube-api-access-74qcj") pod "0c078473-c6c2-4268-b51f-de10a322f6bb" (UID: "0c078473-c6c2-4268-b51f-de10a322f6bb"). InnerVolumeSpecName "kube-api-access-74qcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:07 crc kubenswrapper[4722]: I1002 09:52:07.953479 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74qcj\" (UniqueName: \"kubernetes.io/projected/0c078473-c6c2-4268-b51f-de10a322f6bb-kube-api-access-74qcj\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:08 crc kubenswrapper[4722]: I1002 09:52:08.450684 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Oct 02 09:52:08 crc kubenswrapper[4722]: I1002 09:52:08.462125 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Oct 02 09:52:08 crc kubenswrapper[4722]: I1002 09:52:08.466060 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Oct 02 09:52:08 crc kubenswrapper[4722]: I1002 09:52:08.592381 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-2" podUID="7cb4f813-1f97-4272-943d-05fa71c599dc" containerName="galera" containerID="cri-o://e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1" gracePeriod=30 Oct 02 09:52:08 crc kubenswrapper[4722]: I1002 09:52:08.720277 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4715d909-cf7b-4a50-add4-38e3482e6a5a" path="/var/lib/kubelet/pods/4715d909-cf7b-4a50-add4-38e3482e6a5a/volumes" Oct 02 09:52:08 crc kubenswrapper[4722]: I1002 09:52:08.720995 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a314cbe4-1509-4bb3-a6ab-45ec40fa632d" path="/var/lib/kubelet/pods/a314cbe4-1509-4bb3-a6ab-45ec40fa632d/volumes" Oct 02 09:52:08 crc kubenswrapper[4722]: I1002 09:52:08.721494 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3df4880-3d2d-429c-87b9-cad9afbbb682" path="/var/lib/kubelet/pods/b3df4880-3d2d-429c-87b9-cad9afbbb682/volumes" Oct 02 09:52:08 crc kubenswrapper[4722]: I1002 09:52:08.722141 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6107a52-6863-4d02-a3ec-dc6161bfac26" path="/var/lib/kubelet/pods/c6107a52-6863-4d02-a3ec-dc6161bfac26/volumes" Oct 02 09:52:09 crc kubenswrapper[4722]: I1002 09:52:08.816881 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone9777-account-delete-l48jd" Oct 02 09:52:09 crc kubenswrapper[4722]: I1002 09:52:08.872965 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone9777-account-delete-l48jd"] Oct 02 09:52:09 crc kubenswrapper[4722]: I1002 09:52:08.878625 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone9777-account-delete-l48jd"] Oct 02 09:52:09 crc kubenswrapper[4722]: I1002 09:52:09.096534 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Oct 02 09:52:09 crc kubenswrapper[4722]: I1002 09:52:09.096863 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/memcached-0" podUID="b4d42537-11aa-4a12-8137-feca0a8d889b" containerName="memcached" containerID="cri-o://86b6af3ea1cebe16e80a7c7f66941e145bbcd18f9388440526dd59ea8766e63f" gracePeriod=30 Oct 02 09:52:09 crc kubenswrapper[4722]: E1002 09:52:09.212798 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb4f813_1f97_4272_943d_05fa71c599dc.slice/crio-e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1.scope\": RecentStats: unable to find data in memory cache]" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.491462 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.494842 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.587669 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6gm2\" (UniqueName: \"kubernetes.io/projected/7cb4f813-1f97-4272-943d-05fa71c599dc-kube-api-access-z6gm2\") pod \"7cb4f813-1f97-4272-943d-05fa71c599dc\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.587724 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-generated\") pod \"7cb4f813-1f97-4272-943d-05fa71c599dc\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.587825 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-default\") pod \"7cb4f813-1f97-4272-943d-05fa71c599dc\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.587859 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7cb4f813-1f97-4272-943d-05fa71c599dc-secrets\") pod \"7cb4f813-1f97-4272-943d-05fa71c599dc\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.587881 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-operator-scripts\") pod \"7cb4f813-1f97-4272-943d-05fa71c599dc\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " Oct 02 09:52:10 crc 
kubenswrapper[4722]: I1002 09:52:09.587903 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"7cb4f813-1f97-4272-943d-05fa71c599dc\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.587941 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-kolla-config\") pod \"7cb4f813-1f97-4272-943d-05fa71c599dc\" (UID: \"7cb4f813-1f97-4272-943d-05fa71c599dc\") " Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.588440 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "7cb4f813-1f97-4272-943d-05fa71c599dc" (UID: "7cb4f813-1f97-4272-943d-05fa71c599dc"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.589290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7cb4f813-1f97-4272-943d-05fa71c599dc" (UID: "7cb4f813-1f97-4272-943d-05fa71c599dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.589387 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7cb4f813-1f97-4272-943d-05fa71c599dc" (UID: "7cb4f813-1f97-4272-943d-05fa71c599dc"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.589589 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "7cb4f813-1f97-4272-943d-05fa71c599dc" (UID: "7cb4f813-1f97-4272-943d-05fa71c599dc"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.593899 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb4f813-1f97-4272-943d-05fa71c599dc-kube-api-access-z6gm2" (OuterVolumeSpecName: "kube-api-access-z6gm2") pod "7cb4f813-1f97-4272-943d-05fa71c599dc" (UID: "7cb4f813-1f97-4272-943d-05fa71c599dc"). InnerVolumeSpecName "kube-api-access-z6gm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.594055 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb4f813-1f97-4272-943d-05fa71c599dc-secrets" (OuterVolumeSpecName: "secrets") pod "7cb4f813-1f97-4272-943d-05fa71c599dc" (UID: "7cb4f813-1f97-4272-943d-05fa71c599dc"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.598401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "7cb4f813-1f97-4272-943d-05fa71c599dc" (UID: "7cb4f813-1f97-4272-943d-05fa71c599dc"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.689177 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6gm2\" (UniqueName: \"kubernetes.io/projected/7cb4f813-1f97-4272-943d-05fa71c599dc-kube-api-access-z6gm2\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.689209 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.689218 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.689226 4722 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7cb4f813-1f97-4272-943d-05fa71c599dc-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.689237 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.689269 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.689280 4722 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7cb4f813-1f97-4272-943d-05fa71c599dc-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.701115 4722 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.794871 4722 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.825638 4722 generic.go:334] "Generic (PLEG): container finished" podID="7cb4f813-1f97-4272-943d-05fa71c599dc" containerID="e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1" exitCode=0 Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.825684 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7cb4f813-1f97-4272-943d-05fa71c599dc","Type":"ContainerDied","Data":"e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1"} Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.825722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7cb4f813-1f97-4272-943d-05fa71c599dc","Type":"ContainerDied","Data":"b423cb8a141a570a472bfc35562351bd1d37f95bb306a5ddce57e793b4f52835"} Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.825747 4722 scope.go:117] "RemoveContainer" containerID="e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.825752 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.867147 4722 scope.go:117] "RemoveContainer" containerID="f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.883510 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.889965 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.894661 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.911953 4722 scope.go:117] "RemoveContainer" containerID="e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1" Oct 02 09:52:10 crc kubenswrapper[4722]: E1002 09:52:09.912373 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1\": container with ID starting with e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1 not found: ID does not exist" containerID="e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.912402 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1"} err="failed to get container status \"e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1\": rpc error: code = NotFound desc = could not find container \"e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1\": container with ID starting with e85c3885111193ba4c7cee2ba1ae687547e4e42c47914248d94caff929f43be1 not found: ID does not exist" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.912427 4722 scope.go:117] "RemoveContainer" containerID="f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb" Oct 02 09:52:10 crc kubenswrapper[4722]: E1002 09:52:09.912804 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb\": container with ID starting with f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb not found: ID does not exist" containerID="f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.912845 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb"} err="failed to get container status \"f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb\": rpc error: code = NotFound desc = could not find container 
\"f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb\": container with ID starting with f8f3b68509ddf3b0a710af7d61652e8bda12426666c8786f947bc643326fa9eb not found: ID does not exist" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:09.931367 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/rabbitmq-server-0" podUID="81c18fc8-43c0-4993-b297-c158ba94f076" containerName="rabbitmq" containerID="cri-o://f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb" gracePeriod=604800 Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.696377 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.696850 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.698423 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-1" podUID="4383f5f8-eb32-4c6b-a3a5-0943a567e856" containerName="galera" containerID="cri-o://2230e2db400f30c4e67d870f42bef7d57c323d5b014b2863ebfd568758552b96" gracePeriod=28 Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.721719 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c078473-c6c2-4268-b51f-de10a322f6bb" path="/var/lib/kubelet/pods/0c078473-c6c2-4268-b51f-de10a322f6bb/volumes" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.722511 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb4f813-1f97-4272-943d-05fa71c599dc" path="/var/lib/kubelet/pods/7cb4f813-1f97-4272-943d-05fa71c599dc/volumes" Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.905816 4722 generic.go:334] "Generic (PLEG): container finished" podID="8198b0a7-cf5a-41f0-a528-0f3833ff97a6" containerID="e329ca8245e6f8ad5d6e82c79296035a0ba51fcaabed81266e371466d8d27b9f" exitCode=0 Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.905903 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" event={"ID":"8198b0a7-cf5a-41f0-a528-0f3833ff97a6","Type":"ContainerDied","Data":"e329ca8245e6f8ad5d6e82c79296035a0ba51fcaabed81266e371466d8d27b9f"} Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.944765 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4d42537-11aa-4a12-8137-feca0a8d889b" containerID="86b6af3ea1cebe16e80a7c7f66941e145bbcd18f9388440526dd59ea8766e63f" exitCode=0 Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.944807 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"b4d42537-11aa-4a12-8137-feca0a8d889b","Type":"ContainerDied","Data":"86b6af3ea1cebe16e80a7c7f66941e145bbcd18f9388440526dd59ea8766e63f"} Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.972919 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw"] Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 
09:52:10.973164 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" podUID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" containerName="manager" containerID="cri-o://32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9" gracePeriod=10 Oct 02 09:52:10 crc kubenswrapper[4722]: I1002 09:52:10.973707 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" podUID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" containerName="kube-rbac-proxy" containerID="cri-o://dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef" gracePeriod=10 Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.195850 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-fsxkj"] Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.196482 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-fsxkj" podUID="e09d4bc8-436e-4faa-9416-3d76a569c97e" containerName="registry-server" containerID="cri-o://e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82" gracePeriod=30 Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.260444 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69"] Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.277304 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/8335c94caa113f9a451d061ea94690b3e9b02b7104da5fa6096e5e2731pbq69"] Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.301085 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.332302 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-fernet-keys\") pod \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.332376 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-config-data\") pod \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.332392 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-scripts\") pod \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.332418 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-credential-keys\") pod \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.332449 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2btq\" (UniqueName: \"kubernetes.io/projected/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-kube-api-access-z2btq\") pod \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\" (UID: \"8198b0a7-cf5a-41f0-a528-0f3833ff97a6\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.349551 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-scripts" (OuterVolumeSpecName: "scripts") pod "8198b0a7-cf5a-41f0-a528-0f3833ff97a6" (UID: "8198b0a7-cf5a-41f0-a528-0f3833ff97a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.349761 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-kube-api-access-z2btq" (OuterVolumeSpecName: "kube-api-access-z2btq") pod "8198b0a7-cf5a-41f0-a528-0f3833ff97a6" (UID: "8198b0a7-cf5a-41f0-a528-0f3833ff97a6"). InnerVolumeSpecName "kube-api-access-z2btq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.351419 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8198b0a7-cf5a-41f0-a528-0f3833ff97a6" (UID: "8198b0a7-cf5a-41f0-a528-0f3833ff97a6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.357722 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8198b0a7-cf5a-41f0-a528-0f3833ff97a6" (UID: "8198b0a7-cf5a-41f0-a528-0f3833ff97a6"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.398403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-config-data" (OuterVolumeSpecName: "config-data") pod "8198b0a7-cf5a-41f0-a528-0f3833ff97a6" (UID: "8198b0a7-cf5a-41f0-a528-0f3833ff97a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.436516 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.436547 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.436556 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.436564 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.436575 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2btq\" (UniqueName: \"kubernetes.io/projected/8198b0a7-cf5a-41f0-a528-0f3833ff97a6-kube-api-access-z2btq\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.446788 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.538463 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-config-data\") pod \"b4d42537-11aa-4a12-8137-feca0a8d889b\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.538574 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-kolla-config\") pod \"b4d42537-11aa-4a12-8137-feca0a8d889b\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.538681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2z69\" (UniqueName: \"kubernetes.io/projected/b4d42537-11aa-4a12-8137-feca0a8d889b-kube-api-access-t2z69\") pod \"b4d42537-11aa-4a12-8137-feca0a8d889b\" (UID: \"b4d42537-11aa-4a12-8137-feca0a8d889b\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.539493 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-config-data" (OuterVolumeSpecName: "config-data") pod "b4d42537-11aa-4a12-8137-feca0a8d889b" (UID: "b4d42537-11aa-4a12-8137-feca0a8d889b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.540629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b4d42537-11aa-4a12-8137-feca0a8d889b" (UID: "b4d42537-11aa-4a12-8137-feca0a8d889b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.543385 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d42537-11aa-4a12-8137-feca0a8d889b-kube-api-access-t2z69" (OuterVolumeSpecName: "kube-api-access-t2z69") pod "b4d42537-11aa-4a12-8137-feca0a8d889b" (UID: "b4d42537-11aa-4a12-8137-feca0a8d889b"). InnerVolumeSpecName "kube-api-access-t2z69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.574488 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.640085 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-apiservice-cert\") pod \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.640146 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z7jn\" (UniqueName: \"kubernetes.io/projected/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-kube-api-access-8z7jn\") pod \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.640191 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-webhook-cert\") pod \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\" (UID: \"ff0e8efc-3b56-4f5e-9944-9e34a644c05e\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.641342 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.641369 4722 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4d42537-11aa-4a12-8137-feca0a8d889b-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.641382 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2z69\" (UniqueName: \"kubernetes.io/projected/b4d42537-11aa-4a12-8137-feca0a8d889b-kube-api-access-t2z69\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.646261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "ff0e8efc-3b56-4f5e-9944-9e34a644c05e" (UID: "ff0e8efc-3b56-4f5e-9944-9e34a644c05e"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.646315 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-kube-api-access-8z7jn" (OuterVolumeSpecName: "kube-api-access-8z7jn") pod "ff0e8efc-3b56-4f5e-9944-9e34a644c05e" (UID: "ff0e8efc-3b56-4f5e-9944-9e34a644c05e"). InnerVolumeSpecName "kube-api-access-8z7jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.651630 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "ff0e8efc-3b56-4f5e-9944-9e34a644c05e" (UID: "ff0e8efc-3b56-4f5e-9944-9e34a644c05e"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.723165 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.753699 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z7jn\" (UniqueName: \"kubernetes.io/projected/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-kube-api-access-8z7jn\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.753728 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.753739 4722 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff0e8efc-3b56-4f5e-9944-9e34a644c05e-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.776830 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.854598 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-plugins\") pod \"81c18fc8-43c0-4993-b297-c158ba94f076\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.854664 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-erlang-cookie\") pod \"81c18fc8-43c0-4993-b297-c158ba94f076\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.854698 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-confd\") pod \"81c18fc8-43c0-4993-b297-c158ba94f076\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.854743 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf9cg\" (UniqueName: \"kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-kube-api-access-cf9cg\") pod \"81c18fc8-43c0-4993-b297-c158ba94f076\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.854783 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/81c18fc8-43c0-4993-b297-c158ba94f076-erlang-cookie-secret\") pod \"81c18fc8-43c0-4993-b297-c158ba94f076\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.854816 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/81c18fc8-43c0-4993-b297-c158ba94f076-pod-info\") pod \"81c18fc8-43c0-4993-b297-c158ba94f076\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.854848 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/81c18fc8-43c0-4993-b297-c158ba94f076-plugins-conf\") pod \"81c18fc8-43c0-4993-b297-c158ba94f076\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.854910 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slsd4\" (UniqueName: \"kubernetes.io/projected/e09d4bc8-436e-4faa-9416-3d76a569c97e-kube-api-access-slsd4\") pod \"e09d4bc8-436e-4faa-9416-3d76a569c97e\" (UID: \"e09d4bc8-436e-4faa-9416-3d76a569c97e\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.855081 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\") pod \"81c18fc8-43c0-4993-b297-c158ba94f076\" (UID: \"81c18fc8-43c0-4993-b297-c158ba94f076\") " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.856040 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-erlang-cookie" 
(OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "81c18fc8-43c0-4993-b297-c158ba94f076" (UID: "81c18fc8-43c0-4993-b297-c158ba94f076"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.856105 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "81c18fc8-43c0-4993-b297-c158ba94f076" (UID: "81c18fc8-43c0-4993-b297-c158ba94f076"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.857698 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c18fc8-43c0-4993-b297-c158ba94f076-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "81c18fc8-43c0-4993-b297-c158ba94f076" (UID: "81c18fc8-43c0-4993-b297-c158ba94f076"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.860100 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/81c18fc8-43c0-4993-b297-c158ba94f076-pod-info" (OuterVolumeSpecName: "pod-info") pod "81c18fc8-43c0-4993-b297-c158ba94f076" (UID: "81c18fc8-43c0-4993-b297-c158ba94f076"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.860560 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c18fc8-43c0-4993-b297-c158ba94f076-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "81c18fc8-43c0-4993-b297-c158ba94f076" (UID: "81c18fc8-43c0-4993-b297-c158ba94f076"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.860937 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09d4bc8-436e-4faa-9416-3d76a569c97e-kube-api-access-slsd4" (OuterVolumeSpecName: "kube-api-access-slsd4") pod "e09d4bc8-436e-4faa-9416-3d76a569c97e" (UID: "e09d4bc8-436e-4faa-9416-3d76a569c97e"). InnerVolumeSpecName "kube-api-access-slsd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.861165 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-kube-api-access-cf9cg" (OuterVolumeSpecName: "kube-api-access-cf9cg") pod "81c18fc8-43c0-4993-b297-c158ba94f076" (UID: "81c18fc8-43c0-4993-b297-c158ba94f076"). InnerVolumeSpecName "kube-api-access-cf9cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.869393 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d" (OuterVolumeSpecName: "persistence") pod "81c18fc8-43c0-4993-b297-c158ba94f076" (UID: "81c18fc8-43c0-4993-b297-c158ba94f076"). InnerVolumeSpecName "pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.930717 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "81c18fc8-43c0-4993-b297-c158ba94f076" (UID: "81c18fc8-43c0-4993-b297-c158ba94f076"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.954278 4722 generic.go:334] "Generic (PLEG): container finished" podID="81c18fc8-43c0-4993-b297-c158ba94f076" containerID="f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb" exitCode=0 Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.954346 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.954350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"81c18fc8-43c0-4993-b297-c158ba94f076","Type":"ContainerDied","Data":"f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb"} Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.954402 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"81c18fc8-43c0-4993-b297-c158ba94f076","Type":"ContainerDied","Data":"b81787c02edc6462cb4ab799ca839e49e9429d7e87bd5eea9025347b98d6fca8"} Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.954421 4722 scope.go:117] "RemoveContainer" containerID="f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.956498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" event={"ID":"8198b0a7-cf5a-41f0-a528-0f3833ff97a6","Type":"ContainerDied","Data":"56dd6ce64b50e82aa885784f29ea21a983bef1b28e5a012629c0774e58238368"} Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.956593 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-54b885497-t8c9k" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.957699 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/81c18fc8-43c0-4993-b297-c158ba94f076-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.957731 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/81c18fc8-43c0-4993-b297-c158ba94f076-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.957749 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/81c18fc8-43c0-4993-b297-c158ba94f076-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.957765 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slsd4\" (UniqueName: \"kubernetes.io/projected/e09d4bc8-436e-4faa-9416-3d76a569c97e-kube-api-access-slsd4\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.957811 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\") on node \"crc\" " Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.957829 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.957844 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.957860 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.957904 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf9cg\" (UniqueName: \"kubernetes.io/projected/81c18fc8-43c0-4993-b297-c158ba94f076-kube-api-access-cf9cg\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.963242 4722 generic.go:334] "Generic (PLEG): container finished" podID="e09d4bc8-436e-4faa-9416-3d76a569c97e" containerID="e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82" exitCode=0 Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.963448 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-fsxkj" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.963396 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-fsxkj" event={"ID":"e09d4bc8-436e-4faa-9416-3d76a569c97e","Type":"ContainerDied","Data":"e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82"} Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.965045 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-fsxkj" event={"ID":"e09d4bc8-436e-4faa-9416-3d76a569c97e","Type":"ContainerDied","Data":"14bf8f5a9371cbcb71c3447895a32c6714d38c9a6235ff4c2b4f083d13b6d7a7"} Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.965142 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"b4d42537-11aa-4a12-8137-feca0a8d889b","Type":"ContainerDied","Data":"276e0918463e11b8164acda9abbb1a7d0163a0fa86b696f8da313894e24e5619"} Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.965284 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.967939 4722 generic.go:334] "Generic (PLEG): container finished" podID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" containerID="dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef" exitCode=0 Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.967968 4722 generic.go:334] "Generic (PLEG): container finished" podID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" containerID="32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9" exitCode=0 Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.967994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" event={"ID":"ff0e8efc-3b56-4f5e-9944-9e34a644c05e","Type":"ContainerDied","Data":"dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef"} Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.968057 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" event={"ID":"ff0e8efc-3b56-4f5e-9944-9e34a644c05e","Type":"ContainerDied","Data":"32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9"} Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.968074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" event={"ID":"ff0e8efc-3b56-4f5e-9944-9e34a644c05e","Type":"ContainerDied","Data":"00b53174e2fde26756a7ab16b1dba5af50a7710ba53394dfd075060f2a83a662"} Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.968138 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw" Oct 02 09:52:11 crc kubenswrapper[4722]: I1002 09:52:11.977506 4722 scope.go:117] "RemoveContainer" containerID="f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d" Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.001568 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.011475 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.011792 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d") on node "crc"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.012732 4722 scope.go:117] "RemoveContainer" containerID="f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.012838 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"]
Oct 02 09:52:12 crc kubenswrapper[4722]: E1002 09:52:12.013663 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb\": container with ID starting with f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb not found: ID does not exist" containerID="f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.013711 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb"} err="failed to get container status \"f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb\": rpc error: code = NotFound desc = could not find container \"f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb\": container with ID starting with f933ea6cacf864c31c7efc116b8e3e8a82c82fffa11d020ef9b1c9222164e3fb not found: ID does not exist"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.013747 4722 scope.go:117] "RemoveContainer" containerID="f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d"
Oct 02 09:52:12 crc kubenswrapper[4722]: E1002 09:52:12.014134 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d\": container with ID starting with f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d not found: ID does not exist" containerID="f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.014166 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d"} err="failed to get container status \"f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d\": rpc error: code = NotFound desc = could not find container \"f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d\": container with ID starting with f03e2f12afe519ad73605106d743cbd0a8188c1a33eeb5a0dd817cce98eaa52d not found: ID does not exist"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.014208 4722 scope.go:117] "RemoveContainer" containerID="e329ca8245e6f8ad5d6e82c79296035a0ba51fcaabed81266e371466d8d27b9f"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.020990 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-54b885497-t8c9k"]
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.031968 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-54b885497-t8c9k"]
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.037841 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-fsxkj"]
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.042018 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-fsxkj"]
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.042780 4722 scope.go:117] "RemoveContainer" containerID="e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.051523 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw"]
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.059559 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8640b5e6-1bf2-4f88-af7e-f86af06afa4d\") on node \"crc\" DevicePath \"\""
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.060278 4722 scope.go:117] "RemoveContainer" containerID="e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.061130 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7dbf94768f-lxtzw"]
Oct 02 09:52:12 crc kubenswrapper[4722]: E1002 09:52:12.061200 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82\": container with ID starting with e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82 not found: ID does not exist" containerID="e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.061239 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82"} err="failed to get container status \"e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82\": rpc error: code = NotFound desc = could not find container \"e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82\": container with ID starting with e8f97d4cfb43093b0435d77d81e8da482d512dd24057fff7dfe5ca7914c33b82 not found: ID does not exist"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.061277 4722 scope.go:117] "RemoveContainer" containerID="86b6af3ea1cebe16e80a7c7f66941e145bbcd18f9388440526dd59ea8766e63f"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.068699 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/memcached-0"]
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.073510 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/memcached-0"]
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.078922 4722 scope.go:117] "RemoveContainer" containerID="dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.098611 4722 scope.go:117] "RemoveContainer" containerID="32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.113037 4722 scope.go:117] "RemoveContainer" containerID="dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef"
Oct 02 09:52:12 crc kubenswrapper[4722]: E1002 09:52:12.113506 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef\": container with ID starting with dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef not found: ID does not exist" containerID="dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.113538 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef"} err="failed to get container status \"dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef\": rpc error: code = NotFound desc = could not find container \"dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef\": container with ID starting with dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef not found: ID does not exist"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.113567 4722 scope.go:117] "RemoveContainer" containerID="32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9"
Oct 02 09:52:12 crc kubenswrapper[4722]: E1002 09:52:12.113794 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9\": container with ID starting with 32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9 not found: ID does not exist" containerID="32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.113817 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9"} err="failed to get container status \"32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9\": rpc error: code = NotFound desc = could not find container \"32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9\": container with ID starting with 32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9 not found: ID does not exist"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.113829 4722 scope.go:117] "RemoveContainer" containerID="dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.114032 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef"} err="failed to get container status \"dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef\": rpc error: code = NotFound desc = could not find container \"dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef\": container with ID starting with dc6e86214e0e09921250cd96ef06a234f6e593f15fc9d4c3d81ec4ccc85b7cef not found: ID does not exist"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.114112 4722 scope.go:117] "RemoveContainer" containerID="32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9"
Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.114415 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9"} err="failed to get container status \"32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9\": rpc error: code = NotFound desc = could not find container
\"32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9\": container with ID starting with 32008f8b3c1c62a4235f2c06d26d4707b184904400b1763df6f2f6c48bd590d9 not found: ID does not exist" Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.707764 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-0" podUID="9103aa77-3c03-4ece-8293-b6ccd66a0cbb" containerName="galera" containerID="cri-o://619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078" gracePeriod=26 Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.720709 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f6efda-6992-4bdd-9da0-b7cc70266f05" path="/var/lib/kubelet/pods/52f6efda-6992-4bdd-9da0-b7cc70266f05/volumes" Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.721584 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8198b0a7-cf5a-41f0-a528-0f3833ff97a6" path="/var/lib/kubelet/pods/8198b0a7-cf5a-41f0-a528-0f3833ff97a6/volumes" Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.722442 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c18fc8-43c0-4993-b297-c158ba94f076" path="/var/lib/kubelet/pods/81c18fc8-43c0-4993-b297-c158ba94f076/volumes" Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.723680 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d42537-11aa-4a12-8137-feca0a8d889b" path="/var/lib/kubelet/pods/b4d42537-11aa-4a12-8137-feca0a8d889b/volumes" Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.724203 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09d4bc8-436e-4faa-9416-3d76a569c97e" path="/var/lib/kubelet/pods/e09d4bc8-436e-4faa-9416-3d76a569c97e/volumes" Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.724803 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" path="/var/lib/kubelet/pods/ff0e8efc-3b56-4f5e-9944-9e34a644c05e/volumes" Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.975457 4722 generic.go:334] "Generic (PLEG): container finished" podID="4383f5f8-eb32-4c6b-a3a5-0943a567e856" containerID="2230e2db400f30c4e67d870f42bef7d57c323d5b014b2863ebfd568758552b96" exitCode=0 Oct 02 09:52:12 crc kubenswrapper[4722]: I1002 09:52:12.975518 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"4383f5f8-eb32-4c6b-a3a5-0943a567e856","Type":"ContainerDied","Data":"2230e2db400f30c4e67d870f42bef7d57c323d5b014b2863ebfd568758552b96"} Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.050292 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97"] Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.050621 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" podUID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" containerName="kube-rbac-proxy" containerID="cri-o://049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0" gracePeriod=10 Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.050803 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" podUID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" containerName="manager" 
containerID="cri-o://47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72" gracePeriod=10 Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.349252 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-zldfm"] Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.350081 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-zldfm" podUID="d5efc7dc-5ee4-4e70-ad78-acd076852fcb" containerName="registry-server" containerID="cri-o://c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659" gracePeriod=30 Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.388511 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9"] Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.404935 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436n66v9"] Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.530593 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.583701 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-operator-scripts\") pod \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.583829 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.583903 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4383f5f8-eb32-4c6b-a3a5-0943a567e856-secrets\") pod \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.583928 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-default\") pod \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.583952 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsv5d\" (UniqueName: \"kubernetes.io/projected/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kube-api-access-gsv5d\") pod \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.583975 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kolla-config\") pod \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.584057 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-generated\") pod \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\" (UID: \"4383f5f8-eb32-4c6b-a3a5-0943a567e856\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.587688 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "4383f5f8-eb32-4c6b-a3a5-0943a567e856" (UID: "4383f5f8-eb32-4c6b-a3a5-0943a567e856"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.591263 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "4383f5f8-eb32-4c6b-a3a5-0943a567e856" (UID: "4383f5f8-eb32-4c6b-a3a5-0943a567e856"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.592482 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4383f5f8-eb32-4c6b-a3a5-0943a567e856-secrets" (OuterVolumeSpecName: "secrets") pod "4383f5f8-eb32-4c6b-a3a5-0943a567e856" (UID: "4383f5f8-eb32-4c6b-a3a5-0943a567e856"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.592808 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4383f5f8-eb32-4c6b-a3a5-0943a567e856" (UID: "4383f5f8-eb32-4c6b-a3a5-0943a567e856"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.593577 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4383f5f8-eb32-4c6b-a3a5-0943a567e856" (UID: "4383f5f8-eb32-4c6b-a3a5-0943a567e856"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.597537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kube-api-access-gsv5d" (OuterVolumeSpecName: "kube-api-access-gsv5d") pod "4383f5f8-eb32-4c6b-a3a5-0943a567e856" (UID: "4383f5f8-eb32-4c6b-a3a5-0943a567e856"). InnerVolumeSpecName "kube-api-access-gsv5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.601884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "4383f5f8-eb32-4c6b-a3a5-0943a567e856" (UID: "4383f5f8-eb32-4c6b-a3a5-0943a567e856"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.617809 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.687450 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8bmx\" (UniqueName: \"kubernetes.io/projected/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-kube-api-access-t8bmx\") pod \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.687565 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-webhook-cert\") pod \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.687614 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-apiservice-cert\") pod \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\" (UID: \"df9a46f7-6264-4bc4-b5f2-87a84a38a06c\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.690656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "df9a46f7-6264-4bc4-b5f2-87a84a38a06c" (UID: "df9a46f7-6264-4bc4-b5f2-87a84a38a06c"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.693412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-kube-api-access-t8bmx" (OuterVolumeSpecName: "kube-api-access-t8bmx") pod "df9a46f7-6264-4bc4-b5f2-87a84a38a06c" (UID: "df9a46f7-6264-4bc4-b5f2-87a84a38a06c"). InnerVolumeSpecName "kube-api-access-t8bmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.694166 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.694207 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8bmx\" (UniqueName: \"kubernetes.io/projected/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-kube-api-access-t8bmx\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.694236 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.694249 4722 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4383f5f8-eb32-4c6b-a3a5-0943a567e856-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.694261 4722 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.694273 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsv5d\" (UniqueName: \"kubernetes.io/projected/4383f5f8-eb32-4c6b-a3a5-0943a567e856-kube-api-access-gsv5d\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.694291 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4383f5f8-eb32-4c6b-a3a5-0943a567e856-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.694404 4722 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.694444 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4383f5f8-eb32-4c6b-a3a5-0943a567e856-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.701961 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.698666 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "df9a46f7-6264-4bc4-b5f2-87a84a38a06c" (UID: "df9a46f7-6264-4bc4-b5f2-87a84a38a06c"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.712567 4722 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.792776 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.795231 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-secrets\") pod \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.795310 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw79b\" (UniqueName: \"kubernetes.io/projected/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kube-api-access-dw79b\") pod \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.795387 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-default\") pod \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.795436 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.795461 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-operator-scripts\") pod \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.795496 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-generated\") pod \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.795537 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kolla-config\") pod \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\" (UID: \"9103aa77-3c03-4ece-8293-b6ccd66a0cbb\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.796132 4722 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.796151 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df9a46f7-6264-4bc4-b5f2-87a84a38a06c-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.796144 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9103aa77-3c03-4ece-8293-b6ccd66a0cbb" (UID: "9103aa77-3c03-4ece-8293-b6ccd66a0cbb"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.799550 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-secrets" (OuterVolumeSpecName: "secrets") pod "9103aa77-3c03-4ece-8293-b6ccd66a0cbb" (UID: "9103aa77-3c03-4ece-8293-b6ccd66a0cbb"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.802174 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kube-api-access-dw79b" (OuterVolumeSpecName: "kube-api-access-dw79b") pod "9103aa77-3c03-4ece-8293-b6ccd66a0cbb" (UID: "9103aa77-3c03-4ece-8293-b6ccd66a0cbb"). InnerVolumeSpecName "kube-api-access-dw79b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.803308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9103aa77-3c03-4ece-8293-b6ccd66a0cbb" (UID: "9103aa77-3c03-4ece-8293-b6ccd66a0cbb"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.804094 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9103aa77-3c03-4ece-8293-b6ccd66a0cbb" (UID: "9103aa77-3c03-4ece-8293-b6ccd66a0cbb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.804360 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "9103aa77-3c03-4ece-8293-b6ccd66a0cbb" (UID: "9103aa77-3c03-4ece-8293-b6ccd66a0cbb"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.804528 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9103aa77-3c03-4ece-8293-b6ccd66a0cbb" (UID: "9103aa77-3c03-4ece-8293-b6ccd66a0cbb"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.897643 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j595\" (UniqueName: \"kubernetes.io/projected/d5efc7dc-5ee4-4e70-ad78-acd076852fcb-kube-api-access-7j595\") pod \"d5efc7dc-5ee4-4e70-ad78-acd076852fcb\" (UID: \"d5efc7dc-5ee4-4e70-ad78-acd076852fcb\") " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.898465 4722 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.898540 4722 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.898554 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw79b\" (UniqueName: \"kubernetes.io/projected/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-kube-api-access-dw79b\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.898567 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.898599 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.898612 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.898624 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9103aa77-3c03-4ece-8293-b6ccd66a0cbb-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.900626 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5efc7dc-5ee4-4e70-ad78-acd076852fcb-kube-api-access-7j595" (OuterVolumeSpecName: "kube-api-access-7j595") pod "d5efc7dc-5ee4-4e70-ad78-acd076852fcb" (UID: "d5efc7dc-5ee4-4e70-ad78-acd076852fcb"). InnerVolumeSpecName "kube-api-access-7j595". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.909016 4722 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.989245 4722 generic.go:334] "Generic (PLEG): container finished" podID="d5efc7dc-5ee4-4e70-ad78-acd076852fcb" containerID="c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659" exitCode=0 Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.989277 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-zldfm" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.989327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-zldfm" event={"ID":"d5efc7dc-5ee4-4e70-ad78-acd076852fcb","Type":"ContainerDied","Data":"c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659"} Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.989360 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-zldfm" event={"ID":"d5efc7dc-5ee4-4e70-ad78-acd076852fcb","Type":"ContainerDied","Data":"a0012f15a55016abbbd2a428e70886543be73b122284d4bc316f32d32814b6d7"} Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.989384 4722 scope.go:117] "RemoveContainer" containerID="c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.991397 4722 generic.go:334] "Generic (PLEG): container finished" podID="9103aa77-3c03-4ece-8293-b6ccd66a0cbb" containerID="619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078" exitCode=0 Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.991443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"9103aa77-3c03-4ece-8293-b6ccd66a0cbb","Type":"ContainerDied","Data":"619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078"} Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.991462 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"9103aa77-3c03-4ece-8293-b6ccd66a0cbb","Type":"ContainerDied","Data":"e28c0db862557fe74826f533b34b811ddf23021242af250ea413cb4a9fcaeed7"} Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.991507 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.993488 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"4383f5f8-eb32-4c6b-a3a5-0943a567e856","Type":"ContainerDied","Data":"286716b78b4286c5d54750dc243736f103ba7d7e2f8db7d3696f69153d8a37c1"} Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.993515 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.995507 4722 generic.go:334] "Generic (PLEG): container finished" podID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" containerID="049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0" exitCode=0 Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.995534 4722 generic.go:334] "Generic (PLEG): container finished" podID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" containerID="47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72" exitCode=0 Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.995567 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.995553 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" event={"ID":"df9a46f7-6264-4bc4-b5f2-87a84a38a06c","Type":"ContainerDied","Data":"049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0"} Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.995610 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" event={"ID":"df9a46f7-6264-4bc4-b5f2-87a84a38a06c","Type":"ContainerDied","Data":"47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72"} Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.995625 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97" event={"ID":"df9a46f7-6264-4bc4-b5f2-87a84a38a06c","Type":"ContainerDied","Data":"e92861c3ac290b36600758812a0faa9cef1c3b06521e25f64674998cadf3895a"} Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.999869 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j595\" (UniqueName: \"kubernetes.io/projected/d5efc7dc-5ee4-4e70-ad78-acd076852fcb-kube-api-access-7j595\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:13 crc kubenswrapper[4722]: I1002 09:52:13.999895 4722 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.017276 4722 scope.go:117] "RemoveContainer" containerID="c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.017666 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-zldfm"] Oct 02 09:52:14 crc kubenswrapper[4722]: E1002 09:52:14.018002 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659\": container with ID starting with c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659 not found: ID does not exist" containerID="c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.018046 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659"} err="failed to get container status \"c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659\": rpc error: code = NotFound desc = could not find container \"c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659\": container with ID starting with c3826aec3d73b676886a02ab3e280a6ccdfaf25594251c6b91493a857d5d2659 not found: ID does not exist" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.018073 4722 scope.go:117] "RemoveContainer" containerID="619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.022223 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-zldfm"] Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.037990 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["barbican-kuttl-tests/openstack-galera-0"] Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.044547 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.054690 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97"] Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.060789 4722 scope.go:117] "RemoveContainer" containerID="965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.064457 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6979dbf55-qtk97"] Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.070733 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.074624 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.084966 4722 scope.go:117] "RemoveContainer" containerID="619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078" Oct 02 09:52:14 crc kubenswrapper[4722]: E1002 09:52:14.085470 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078\": container with ID starting with 619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078 not found: ID does not exist" containerID="619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.085505 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078"} err="failed to get container status \"619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078\": rpc error: code = NotFound desc = could not find container \"619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078\": container with ID starting with 619f84a5809b1668a3059ede311f159e1c13246085df2510df54360dd2ef4078 not found: ID does not exist" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.085540 4722 scope.go:117] "RemoveContainer" containerID="965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8" Oct 02 09:52:14 crc kubenswrapper[4722]: E1002 09:52:14.086486 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8\": container with ID starting with 965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8 not found: ID does not exist" containerID="965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.086636 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8"} err="failed to get container status \"965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8\": rpc error: code = NotFound desc = could not find container \"965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8\": container with ID starting with 
965305ed0523398949d9dbb1145606f185b4cc3da22f527a1e578d69692b11f8 not found: ID does not exist" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.086750 4722 scope.go:117] "RemoveContainer" containerID="2230e2db400f30c4e67d870f42bef7d57c323d5b014b2863ebfd568758552b96" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.103546 4722 scope.go:117] "RemoveContainer" containerID="014a88a289d5137c4d5f9dff03cce6ac80c19ec64858e10f0244424e9dbdcbc2" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.122650 4722 scope.go:117] "RemoveContainer" containerID="049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.138815 4722 scope.go:117] "RemoveContainer" containerID="47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.155900 4722 scope.go:117] "RemoveContainer" containerID="049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0" Oct 02 09:52:14 crc kubenswrapper[4722]: E1002 09:52:14.156436 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0\": container with ID starting with 049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0 not found: ID does not exist" containerID="049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.156470 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0"} err="failed to get container status \"049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0\": rpc error: code = NotFound desc = could not find container \"049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0\": container with ID starting with 049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0 not found: ID does not exist" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.156494 4722 scope.go:117] "RemoveContainer" containerID="47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72" Oct 02 09:52:14 crc kubenswrapper[4722]: E1002 09:52:14.156777 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72\": container with ID starting with 47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72 not found: ID does not exist" containerID="47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.156799 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72"} err="failed to get container status \"47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72\": rpc error: code = NotFound desc = could not find container \"47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72\": container with ID starting with 47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72 not found: ID does not exist" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.156815 4722 scope.go:117] "RemoveContainer" containerID="049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.157129 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0"} err="failed to get container status \"049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0\": rpc error: code = NotFound desc = could not find container \"049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0\": container with ID starting with 049711e4fe09d7f2a7f42fc6efebd5c486068ad4814d1ec7e9eb4c4ad2dd91d0 not found: ID does not exist" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.157227 4722 scope.go:117] "RemoveContainer" containerID="47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.157671 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72"} err="failed to get container status \"47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72\": rpc error: code = NotFound desc = could not find container \"47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72\": container with ID starting with 47f380c69b25256dbdac871d4dc8ab43d1779330e735aa503048342a88008a72 not found: ID does not exist" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.719180 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4383f5f8-eb32-4c6b-a3a5-0943a567e856" path="/var/lib/kubelet/pods/4383f5f8-eb32-4c6b-a3a5-0943a567e856/volumes" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.721234 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9103aa77-3c03-4ece-8293-b6ccd66a0cbb" path="/var/lib/kubelet/pods/9103aa77-3c03-4ece-8293-b6ccd66a0cbb/volumes" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.722346 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba9a9776-4156-4177-a093-f358250b4552" path="/var/lib/kubelet/pods/ba9a9776-4156-4177-a093-f358250b4552/volumes" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.726573 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5efc7dc-5ee4-4e70-ad78-acd076852fcb" path="/var/lib/kubelet/pods/d5efc7dc-5ee4-4e70-ad78-acd076852fcb/volumes" Oct 02 09:52:14 crc kubenswrapper[4722]: I1002 09:52:14.727715 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" path="/var/lib/kubelet/pods/df9a46f7-6264-4bc4-b5f2-87a84a38a06c/volumes" Oct 02 09:52:15 crc kubenswrapper[4722]: I1002 09:52:15.239086 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm"] Oct 02 09:52:15 crc kubenswrapper[4722]: I1002 09:52:15.239475 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" podUID="812dc418-7dba-4a4c-9b9b-20eb9d801e0a" containerName="operator" containerID="cri-o://dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834" gracePeriod=10 Oct 02 09:52:15 crc kubenswrapper[4722]: I1002 09:52:15.577009 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-wm7rf"] Oct 02 09:52:15 crc kubenswrapper[4722]: I1002 09:52:15.577640 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" podUID="e762843d-7d09-47b6-bda1-92876c1eae97" 
containerName="registry-server" containerID="cri-o://61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537" gracePeriod=30 Oct 02 09:52:15 crc kubenswrapper[4722]: I1002 09:52:15.632072 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk"] Oct 02 09:52:15 crc kubenswrapper[4722]: I1002 09:52:15.641534 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590btwlk"] Oct 02 09:52:15 crc kubenswrapper[4722]: I1002 09:52:15.788650 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" Oct 02 09:52:15 crc kubenswrapper[4722]: I1002 09:52:15.922891 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqpz7\" (UniqueName: \"kubernetes.io/projected/812dc418-7dba-4a4c-9b9b-20eb9d801e0a-kube-api-access-dqpz7\") pod \"812dc418-7dba-4a4c-9b9b-20eb9d801e0a\" (UID: \"812dc418-7dba-4a4c-9b9b-20eb9d801e0a\") " Oct 02 09:52:15 crc kubenswrapper[4722]: I1002 09:52:15.929713 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812dc418-7dba-4a4c-9b9b-20eb9d801e0a-kube-api-access-dqpz7" (OuterVolumeSpecName: "kube-api-access-dqpz7") pod "812dc418-7dba-4a4c-9b9b-20eb9d801e0a" (UID: "812dc418-7dba-4a4c-9b9b-20eb9d801e0a"). InnerVolumeSpecName "kube-api-access-dqpz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:15 crc kubenswrapper[4722]: I1002 09:52:15.950328 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.024168 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmzs4\" (UniqueName: \"kubernetes.io/projected/e762843d-7d09-47b6-bda1-92876c1eae97-kube-api-access-cmzs4\") pod \"e762843d-7d09-47b6-bda1-92876c1eae97\" (UID: \"e762843d-7d09-47b6-bda1-92876c1eae97\") " Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.024490 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqpz7\" (UniqueName: \"kubernetes.io/projected/812dc418-7dba-4a4c-9b9b-20eb9d801e0a-kube-api-access-dqpz7\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.027727 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e762843d-7d09-47b6-bda1-92876c1eae97-kube-api-access-cmzs4" (OuterVolumeSpecName: "kube-api-access-cmzs4") pod "e762843d-7d09-47b6-bda1-92876c1eae97" (UID: "e762843d-7d09-47b6-bda1-92876c1eae97"). InnerVolumeSpecName "kube-api-access-cmzs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.028170 4722 generic.go:334] "Generic (PLEG): container finished" podID="812dc418-7dba-4a4c-9b9b-20eb9d801e0a" containerID="dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834" exitCode=0 Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.028217 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" event={"ID":"812dc418-7dba-4a4c-9b9b-20eb9d801e0a","Type":"ContainerDied","Data":"dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834"} Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.028264 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.028347 4722 scope.go:117] "RemoveContainer" containerID="dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.028588 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm" event={"ID":"812dc418-7dba-4a4c-9b9b-20eb9d801e0a","Type":"ContainerDied","Data":"ebcea3482798ec98eff97b76a89ec30a74a1864e6b879d2dd746409a1271bf4b"} Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.031518 4722 generic.go:334] "Generic (PLEG): container finished" podID="e762843d-7d09-47b6-bda1-92876c1eae97" containerID="61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537" exitCode=0 Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.031555 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" event={"ID":"e762843d-7d09-47b6-bda1-92876c1eae97","Type":"ContainerDied","Data":"61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537"} Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.031582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" event={"ID":"e762843d-7d09-47b6-bda1-92876c1eae97","Type":"ContainerDied","Data":"0d91fb651777c47475791d6ff13cf888179c2c9368ba217c7d2b2741b61e0a0c"} Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.031603 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-wm7rf" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.053663 4722 scope.go:117] "RemoveContainer" containerID="dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834" Oct 02 09:52:16 crc kubenswrapper[4722]: E1002 09:52:16.055427 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834\": container with ID starting with dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834 not found: ID does not exist" containerID="dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.055484 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834"} err="failed to get container status \"dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834\": rpc error: code = NotFound desc = could not find container \"dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834\": container with ID starting with dd0e528a64dd30055aca5481b52fb487d2d1edd031e1e39e9336fedfe6148834 not found: ID does not exist" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.055519 4722 scope.go:117] "RemoveContainer" containerID="61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.065959 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm"] Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.069939 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-xnjdm"] Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.078778 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-wm7rf"] Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.079687 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-wm7rf"] Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.081835 4722 scope.go:117] "RemoveContainer" containerID="61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537" Oct 02 09:52:16 crc kubenswrapper[4722]: E1002 09:52:16.082579 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537\": container with ID starting with 61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537 not found: ID does not exist" containerID="61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.082637 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537"} err="failed to get container status \"61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537\": rpc error: code = NotFound desc = could not find container \"61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537\": container with ID starting with 61373b612f1dd62956f8396dd52c983a3695a5c001d1e851680ed820f1912537 not found: ID does not exist" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.125841 4722 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmzs4\" (UniqueName: \"kubernetes.io/projected/e762843d-7d09-47b6-bda1-92876c1eae97-kube-api-access-cmzs4\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.718592 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812dc418-7dba-4a4c-9b9b-20eb9d801e0a" path="/var/lib/kubelet/pods/812dc418-7dba-4a4c-9b9b-20eb9d801e0a/volumes" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.719114 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9756f09b-bfeb-4b6e-806d-50602bdca92f" path="/var/lib/kubelet/pods/9756f09b-bfeb-4b6e-806d-50602bdca92f/volumes" Oct 02 09:52:16 crc kubenswrapper[4722]: I1002 09:52:16.719765 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e762843d-7d09-47b6-bda1-92876c1eae97" path="/var/lib/kubelet/pods/e762843d-7d09-47b6-bda1-92876c1eae97/volumes" Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.322903 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"] Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.323122 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" podUID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerName="manager" containerID="cri-o://f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71" gracePeriod=10 Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.323177 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" podUID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerName="kube-rbac-proxy" containerID="cri-o://783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28" gracePeriod=10 Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.578368 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-gpr2x"] Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.578926 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-gpr2x" podUID="f25cb989-7ca5-4606-bad2-9a63992d027c" containerName="registry-server" containerID="cri-o://b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b" gracePeriod=30 Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.635955 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"] Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.641057 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/dc342c58da44971d22567f4408cd0f1817e3b92eeb179ae23b9e78aba3q2nsk"] Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.726925 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e" path="/var/lib/kubelet/pods/da0e8dcd-92e0-42b9-bc5c-0580f6ba8d4e/volumes" Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.747869 4722 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.863378 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-apiservice-cert\") pod \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") "
Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.863450 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-webhook-cert\") pod \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") "
Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.863506 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqfxh\" (UniqueName: \"kubernetes.io/projected/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-kube-api-access-bqfxh\") pod \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\" (UID: \"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e\") "
Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.886829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-kube-api-access-bqfxh" (OuterVolumeSpecName: "kube-api-access-bqfxh") pod "24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" (UID: "24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e"). InnerVolumeSpecName "kube-api-access-bqfxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.886860 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" (UID: "24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.887723 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" (UID: "24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.951961 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.965932 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqfxh\" (UniqueName: \"kubernetes.io/projected/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-kube-api-access-bqfxh\") on node \"crc\" DevicePath \"\""
Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.965968 4722 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-apiservice-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:52:18 crc kubenswrapper[4722]: I1002 09:52:18.965979 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e-webhook-cert\") on node \"crc\" DevicePath \"\""
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.054458 4722 generic.go:334] "Generic (PLEG): container finished" podID="f25cb989-7ca5-4606-bad2-9a63992d027c" containerID="b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b" exitCode=0
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.054789 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-gpr2x" event={"ID":"f25cb989-7ca5-4606-bad2-9a63992d027c","Type":"ContainerDied","Data":"b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b"}
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.054815 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-gpr2x" event={"ID":"f25cb989-7ca5-4606-bad2-9a63992d027c","Type":"ContainerDied","Data":"d9fe3b3c90312acca2c0be6f0e2f3d14617d7da392e5d55bbad2e1a15fb0ab07"}
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.054834 4722 scope.go:117] "RemoveContainer" containerID="b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.054843 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-gpr2x"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.062278 4722 generic.go:334] "Generic (PLEG): container finished" podID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerID="783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28" exitCode=0
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.062329 4722 generic.go:334] "Generic (PLEG): container finished" podID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerID="f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71" exitCode=0
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.062353 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" event={"ID":"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e","Type":"ContainerDied","Data":"783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28"}
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.062382 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" event={"ID":"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e","Type":"ContainerDied","Data":"f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71"}
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.062394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" event={"ID":"24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e","Type":"ContainerDied","Data":"b13062f25b60e69fadf625de8cf87f55125f6de82f12f53793c77009ce014625"}
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.062463 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.066377 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-855cv\" (UniqueName: \"kubernetes.io/projected/f25cb989-7ca5-4606-bad2-9a63992d027c-kube-api-access-855cv\") pod \"f25cb989-7ca5-4606-bad2-9a63992d027c\" (UID: \"f25cb989-7ca5-4606-bad2-9a63992d027c\") "
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.071463 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25cb989-7ca5-4606-bad2-9a63992d027c-kube-api-access-855cv" (OuterVolumeSpecName: "kube-api-access-855cv") pod "f25cb989-7ca5-4606-bad2-9a63992d027c" (UID: "f25cb989-7ca5-4606-bad2-9a63992d027c"). InnerVolumeSpecName "kube-api-access-855cv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.077959 4722 scope.go:117] "RemoveContainer" containerID="b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b"
Oct 02 09:52:19 crc kubenswrapper[4722]: E1002 09:52:19.078359 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b\": container with ID starting with b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b not found: ID does not exist" containerID="b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.078454 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b"} err="failed to get container status \"b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b\": rpc error: code = NotFound desc = could not find container \"b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b\": container with ID starting with b40189f913e11628f3bee3ffb7aa34a650424f672b04dab2440e1c8d7061951b not found: ID does not exist"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.078482 4722 scope.go:117] "RemoveContainer" containerID="783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.095491 4722 scope.go:117] "RemoveContainer" containerID="f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.099841 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"]
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.103422 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7"]
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.109214 4722 scope.go:117] "RemoveContainer" containerID="783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28"
Oct 02 09:52:19 crc kubenswrapper[4722]: E1002 09:52:19.109740 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28\": container with ID starting with 783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28 not found: ID does not exist" containerID="783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.109779 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28"} err="failed to get container status \"783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28\": rpc error: code = NotFound desc = could not find container \"783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28\": container with ID starting with 783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28 not found: ID does not exist"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.109806 4722 scope.go:117] "RemoveContainer" containerID="f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71"
Oct 02 09:52:19 crc kubenswrapper[4722]: E1002 09:52:19.110263 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71\": container with ID starting with f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71 not found: ID does not exist" containerID="f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.110328 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71"} err="failed to get container status \"f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71\": rpc error: code = NotFound desc = could not find container \"f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71\": container with ID starting with f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71 not found: ID does not exist"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.110414 4722 scope.go:117] "RemoveContainer" containerID="783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.110857 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28"} err="failed to get container status \"783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28\": rpc error: code = NotFound desc = could not find container \"783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28\": container with ID starting with 783cdf3d622013b524850950ada54253d806c06c748cb6d5806fff3f4eb65d28 not found: ID does not exist"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.110963 4722 scope.go:117] "RemoveContainer" containerID="f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.111295 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71"} err="failed to get container status \"f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71\": rpc error: code = NotFound desc = could not find container \"f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71\": container with ID starting with f6e02f3288c1a993a325db0a9f31a9109bc21c6745d48f83704bc74667165b71 not found: ID does not exist"
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.168080 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-855cv\" (UniqueName: \"kubernetes.io/projected/f25cb989-7ca5-4606-bad2-9a63992d027c-kube-api-access-855cv\") on node \"crc\" DevicePath \"\""
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.408194 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-gpr2x"]
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.412385 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-gpr2x"]
Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.683627 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-59886d58dd-rdcd7" podUID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.64:8081/readyz\": dial tcp 10.217.0.64:8081: i/o timeout (Client.Timeout exceeded while awaiting headers)"
headers)" Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.725093 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"] Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.725380 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" podUID="afacf224-9f10-4ab3-8044-f069cd3e8445" containerName="manager" containerID="cri-o://b598e9b8853344c23fb4cf765a38797ccbdeb8edfce92266a9bad5c2bd5b1ecd" gracePeriod=10 Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.725536 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" podUID="afacf224-9f10-4ab3-8044-f069cd3e8445" containerName="kube-rbac-proxy" containerID="cri-o://366016eee61cb7e974c42abc87061a25332351865ca9fd0f129cfde1f04b0520" gracePeriod=10 Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.979453 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-xzsh7"] Oct 02 09:52:19 crc kubenswrapper[4722]: I1002 09:52:19.979654 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-xzsh7" podUID="dd2b9c64-b51a-46b6-bbc7-c9289da3773a" containerName="registry-server" containerID="cri-o://b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171" gracePeriod=30 Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.054369 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"] Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.061392 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7er96qt"] Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.100486 4722 generic.go:334] "Generic (PLEG): container finished" podID="afacf224-9f10-4ab3-8044-f069cd3e8445" containerID="366016eee61cb7e974c42abc87061a25332351865ca9fd0f129cfde1f04b0520" exitCode=0 Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.100522 4722 generic.go:334] "Generic (PLEG): container finished" podID="afacf224-9f10-4ab3-8044-f069cd3e8445" containerID="b598e9b8853344c23fb4cf765a38797ccbdeb8edfce92266a9bad5c2bd5b1ecd" exitCode=0 Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.100568 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" event={"ID":"afacf224-9f10-4ab3-8044-f069cd3e8445","Type":"ContainerDied","Data":"366016eee61cb7e974c42abc87061a25332351865ca9fd0f129cfde1f04b0520"} Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.100626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" event={"ID":"afacf224-9f10-4ab3-8044-f069cd3e8445","Type":"ContainerDied","Data":"b598e9b8853344c23fb4cf765a38797ccbdeb8edfce92266a9bad5c2bd5b1ecd"} Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.255056 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" Oct 02 09:52:20 crc kubenswrapper[4722]: E1002 09:52:20.347250 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171 is running failed: container process not found" containerID="b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 09:52:20 crc kubenswrapper[4722]: E1002 09:52:20.347838 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171 is running failed: container process not found" containerID="b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 09:52:20 crc kubenswrapper[4722]: E1002 09:52:20.348382 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171 is running failed: container process not found" containerID="b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 09:52:20 crc kubenswrapper[4722]: E1002 09:52:20.348486 4722 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171 is running failed: container process not found" probeType="Readiness" pod="openstack-operators/mariadb-operator-index-xzsh7" podUID="dd2b9c64-b51a-46b6-bbc7-c9289da3773a" containerName="registry-server" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.387521 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwkj4\" (UniqueName: \"kubernetes.io/projected/afacf224-9f10-4ab3-8044-f069cd3e8445-kube-api-access-nwkj4\") pod \"afacf224-9f10-4ab3-8044-f069cd3e8445\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.387796 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-apiservice-cert\") pod \"afacf224-9f10-4ab3-8044-f069cd3e8445\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.387903 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-webhook-cert\") pod \"afacf224-9f10-4ab3-8044-f069cd3e8445\" (UID: \"afacf224-9f10-4ab3-8044-f069cd3e8445\") " Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.393480 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "afacf224-9f10-4ab3-8044-f069cd3e8445" (UID: "afacf224-9f10-4ab3-8044-f069cd3e8445"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.393497 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afacf224-9f10-4ab3-8044-f069cd3e8445-kube-api-access-nwkj4" (OuterVolumeSpecName: "kube-api-access-nwkj4") pod "afacf224-9f10-4ab3-8044-f069cd3e8445" (UID: "afacf224-9f10-4ab3-8044-f069cd3e8445"). InnerVolumeSpecName "kube-api-access-nwkj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.393795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "afacf224-9f10-4ab3-8044-f069cd3e8445" (UID: "afacf224-9f10-4ab3-8044-f069cd3e8445"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.489918 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwkj4\" (UniqueName: \"kubernetes.io/projected/afacf224-9f10-4ab3-8044-f069cd3e8445-kube-api-access-nwkj4\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.489969 4722 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.489979 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/afacf224-9f10-4ab3-8044-f069cd3e8445-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.717216 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" path="/var/lib/kubelet/pods/24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e/volumes" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.718048 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="550a0f2b-ce49-4428-b4c3-79183095ec89" path="/var/lib/kubelet/pods/550a0f2b-ce49-4428-b4c3-79183095ec89/volumes" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.718633 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25cb989-7ca5-4606-bad2-9a63992d027c" path="/var/lib/kubelet/pods/f25cb989-7ca5-4606-bad2-9a63992d027c/volumes" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.889892 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-xzsh7" Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.996942 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrzdd\" (UniqueName: \"kubernetes.io/projected/dd2b9c64-b51a-46b6-bbc7-c9289da3773a-kube-api-access-xrzdd\") pod \"dd2b9c64-b51a-46b6-bbc7-c9289da3773a\" (UID: \"dd2b9c64-b51a-46b6-bbc7-c9289da3773a\") " Oct 02 09:52:20 crc kubenswrapper[4722]: I1002 09:52:20.999587 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2b9c64-b51a-46b6-bbc7-c9289da3773a-kube-api-access-xrzdd" (OuterVolumeSpecName: "kube-api-access-xrzdd") pod "dd2b9c64-b51a-46b6-bbc7-c9289da3773a" (UID: "dd2b9c64-b51a-46b6-bbc7-c9289da3773a"). InnerVolumeSpecName "kube-api-access-xrzdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.098779 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrzdd\" (UniqueName: \"kubernetes.io/projected/dd2b9c64-b51a-46b6-bbc7-c9289da3773a-kube-api-access-xrzdd\") on node \"crc\" DevicePath \"\"" Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.125366 4722 generic.go:334] "Generic (PLEG): container finished" podID="dd2b9c64-b51a-46b6-bbc7-c9289da3773a" containerID="b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171" exitCode=0 Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.125420 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-xzsh7" Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.125465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-xzsh7" event={"ID":"dd2b9c64-b51a-46b6-bbc7-c9289da3773a","Type":"ContainerDied","Data":"b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171"} Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.125505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-xzsh7" event={"ID":"dd2b9c64-b51a-46b6-bbc7-c9289da3773a","Type":"ContainerDied","Data":"2349958cf755ae9429adc5951a4d5decf409861d5a4e9ea5fdc2b85eafd0b0ae"} Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.125529 4722 scope.go:117] "RemoveContainer" containerID="b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171" Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.128077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" event={"ID":"afacf224-9f10-4ab3-8044-f069cd3e8445","Type":"ContainerDied","Data":"aa34f5af3cab89b490a8f2bb8a11a0ea565cdca73c1534988edf87fd77b94937"} Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.128250 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp" Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.145998 4722 scope.go:117] "RemoveContainer" containerID="b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171" Oct 02 09:52:21 crc kubenswrapper[4722]: E1002 09:52:21.146540 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171\": container with ID starting with b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171 not found: ID does not exist" containerID="b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171" Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.146566 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171"} err="failed to get container status \"b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171\": rpc error: code = NotFound desc = could not find container \"b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171\": container with ID starting with b1b0da16a7af0cce8d10414c5a9b4867483e59be4a799e7db0ee30736d01e171 not found: ID does not exist" Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.146584 4722 scope.go:117] "RemoveContainer" containerID="366016eee61cb7e974c42abc87061a25332351865ca9fd0f129cfde1f04b0520" Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.162784 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"] Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.165952 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-585c4c46db-hwxpp"] Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.171053 4722 scope.go:117] "RemoveContainer" containerID="b598e9b8853344c23fb4cf765a38797ccbdeb8edfce92266a9bad5c2bd5b1ecd" Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.172542 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-xzsh7"] Oct 02 09:52:21 crc kubenswrapper[4722]: I1002 09:52:21.181463 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-xzsh7"] Oct 02 09:52:22 crc kubenswrapper[4722]: I1002 09:52:22.724524 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afacf224-9f10-4ab3-8044-f069cd3e8445" path="/var/lib/kubelet/pods/afacf224-9f10-4ab3-8044-f069cd3e8445/volumes" Oct 02 09:52:22 crc kubenswrapper[4722]: I1002 09:52:22.727197 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2b9c64-b51a-46b6-bbc7-c9289da3773a" path="/var/lib/kubelet/pods/dd2b9c64-b51a-46b6-bbc7-c9289da3773a/volumes" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.242487 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ddsj9/must-gather-kfcc8"] Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243261 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afacf224-9f10-4ab3-8044-f069cd3e8445" containerName="kube-rbac-proxy" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243274 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="afacf224-9f10-4ab3-8044-f069cd3e8445" containerName="kube-rbac-proxy" Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243289 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09d4bc8-436e-4faa-9416-3d76a569c97e" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243301 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c18fc8-43c0-4993-b297-c158ba94f076" containerName="rabbitmq"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243307 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c18fc8-43c0-4993-b297-c158ba94f076" containerName="rabbitmq"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243334 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c18fc8-43c0-4993-b297-c158ba94f076" containerName="setup-container"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243340 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c18fc8-43c0-4993-b297-c158ba94f076" containerName="setup-container"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243348 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243354 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243365 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243371 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243380 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9103aa77-3c03-4ece-8293-b6ccd66a0cbb" containerName="galera"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243386 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9103aa77-3c03-4ece-8293-b6ccd66a0cbb" containerName="galera"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243393 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4383f5f8-eb32-4c6b-a3a5-0943a567e856" containerName="galera"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243399 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4383f5f8-eb32-4c6b-a3a5-0943a567e856" containerName="galera"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243405 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5efc7dc-5ee4-4e70-ad78-acd076852fcb" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243411 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5efc7dc-5ee4-4e70-ad78-acd076852fcb" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243416 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d42537-11aa-4a12-8137-feca0a8d889b" containerName="memcached"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243422 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d42537-11aa-4a12-8137-feca0a8d889b" containerName="memcached"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243429 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerName="kube-rbac-proxy"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243434 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerName="kube-rbac-proxy"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243444 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243450 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243457 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9103aa77-3c03-4ece-8293-b6ccd66a0cbb" containerName="mysql-bootstrap"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243463 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9103aa77-3c03-4ece-8293-b6ccd66a0cbb" containerName="mysql-bootstrap"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243469 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb4f813-1f97-4272-943d-05fa71c599dc" containerName="galera"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243475 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb4f813-1f97-4272-943d-05fa71c599dc" containerName="galera"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243485 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb4f813-1f97-4272-943d-05fa71c599dc" containerName="mysql-bootstrap"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243491 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb4f813-1f97-4272-943d-05fa71c599dc" containerName="mysql-bootstrap"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243501 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25cb989-7ca5-4606-bad2-9a63992d027c" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243507 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25cb989-7ca5-4606-bad2-9a63992d027c" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243516 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e762843d-7d09-47b6-bda1-92876c1eae97" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243522 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e762843d-7d09-47b6-bda1-92876c1eae97" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243528 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4383f5f8-eb32-4c6b-a3a5-0943a567e856" containerName="mysql-bootstrap"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243534 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4383f5f8-eb32-4c6b-a3a5-0943a567e856" containerName="mysql-bootstrap"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243543 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afacf224-9f10-4ab3-8044-f069cd3e8445" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243548 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="afacf224-9f10-4ab3-8044-f069cd3e8445" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243557 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" containerName="kube-rbac-proxy"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243562 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" containerName="kube-rbac-proxy"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243571 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812dc418-7dba-4a4c-9b9b-20eb9d801e0a" containerName="operator"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243576 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="812dc418-7dba-4a4c-9b9b-20eb9d801e0a" containerName="operator"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243584 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" containerName="kube-rbac-proxy"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243590 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" containerName="kube-rbac-proxy"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243598 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8198b0a7-cf5a-41f0-a528-0f3833ff97a6" containerName="keystone-api"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243604 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8198b0a7-cf5a-41f0-a528-0f3833ff97a6" containerName="keystone-api"
Oct 02 09:52:29 crc kubenswrapper[4722]: E1002 09:52:29.243611 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2b9c64-b51a-46b6-bbc7-c9289da3773a" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243617 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2b9c64-b51a-46b6-bbc7-c9289da3773a" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243761 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="812dc418-7dba-4a4c-9b9b-20eb9d801e0a" containerName="operator"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243774 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d42537-11aa-4a12-8137-feca0a8d889b" containerName="memcached"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243782 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c18fc8-43c0-4993-b297-c158ba94f076" containerName="rabbitmq"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243790 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4383f5f8-eb32-4c6b-a3a5-0943a567e856" containerName="galera"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243799 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" containerName="kube-rbac-proxy"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243807 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" containerName="kube-rbac-proxy"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243814 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9103aa77-3c03-4ece-8293-b6ccd66a0cbb" containerName="galera"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243820 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8198b0a7-cf5a-41f0-a528-0f3833ff97a6" containerName="keystone-api"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243829 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2b9c64-b51a-46b6-bbc7-c9289da3773a" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243837 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09d4bc8-436e-4faa-9416-3d76a569c97e" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243844 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5efc7dc-5ee4-4e70-ad78-acd076852fcb" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243850 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0e8efc-3b56-4f5e-9944-9e34a644c05e" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243873 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e762843d-7d09-47b6-bda1-92876c1eae97" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243882 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9a46f7-6264-4bc4-b5f2-87a84a38a06c" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243895 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerName="kube-rbac-proxy"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243925 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="afacf224-9f10-4ab3-8044-f069cd3e8445" containerName="kube-rbac-proxy"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243934 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25cb989-7ca5-4606-bad2-9a63992d027c" containerName="registry-server"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243942 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="afacf224-9f10-4ab3-8044-f069cd3e8445" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243950 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb4f813-1f97-4272-943d-05fa71c599dc" containerName="galera"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.243956 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="24fe74dd-17d5-40ca-9bb5-7dbc46f8f38e" containerName="manager"
Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.244655 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ddsj9/must-gather-kfcc8"
Need to start a new one" pod="openshift-must-gather-ddsj9/must-gather-kfcc8" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.247779 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ddsj9"/"openshift-service-ca.crt" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.247852 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ddsj9"/"kube-root-ca.crt" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.247890 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ddsj9"/"default-dockercfg-fpmlw" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.253411 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ddsj9/must-gather-kfcc8"] Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.315348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhpv\" (UniqueName: \"kubernetes.io/projected/aec8a171-5018-4091-8875-4496093fcfee-kube-api-access-vjhpv\") pod \"must-gather-kfcc8\" (UID: \"aec8a171-5018-4091-8875-4496093fcfee\") " pod="openshift-must-gather-ddsj9/must-gather-kfcc8" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.315415 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aec8a171-5018-4091-8875-4496093fcfee-must-gather-output\") pod \"must-gather-kfcc8\" (UID: \"aec8a171-5018-4091-8875-4496093fcfee\") " pod="openshift-must-gather-ddsj9/must-gather-kfcc8" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.416154 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhpv\" (UniqueName: \"kubernetes.io/projected/aec8a171-5018-4091-8875-4496093fcfee-kube-api-access-vjhpv\") pod \"must-gather-kfcc8\" (UID: \"aec8a171-5018-4091-8875-4496093fcfee\") " pod="openshift-must-gather-ddsj9/must-gather-kfcc8" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.416222 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aec8a171-5018-4091-8875-4496093fcfee-must-gather-output\") pod \"must-gather-kfcc8\" (UID: \"aec8a171-5018-4091-8875-4496093fcfee\") " pod="openshift-must-gather-ddsj9/must-gather-kfcc8" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.416653 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aec8a171-5018-4091-8875-4496093fcfee-must-gather-output\") pod \"must-gather-kfcc8\" (UID: \"aec8a171-5018-4091-8875-4496093fcfee\") " pod="openshift-must-gather-ddsj9/must-gather-kfcc8" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.435117 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjhpv\" (UniqueName: \"kubernetes.io/projected/aec8a171-5018-4091-8875-4496093fcfee-kube-api-access-vjhpv\") pod \"must-gather-kfcc8\" (UID: \"aec8a171-5018-4091-8875-4496093fcfee\") " pod="openshift-must-gather-ddsj9/must-gather-kfcc8" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.559795 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ddsj9/must-gather-kfcc8" Oct 02 09:52:29 crc kubenswrapper[4722]: I1002 09:52:29.978255 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ddsj9/must-gather-kfcc8"] Oct 02 09:52:30 crc kubenswrapper[4722]: I1002 09:52:30.192400 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddsj9/must-gather-kfcc8" event={"ID":"aec8a171-5018-4091-8875-4496093fcfee","Type":"ContainerStarted","Data":"0c79dde129cafeebc2487a969c9e56b5c2a77fe6f79b25cb13345511b4ed677a"} Oct 02 09:52:35 crc kubenswrapper[4722]: I1002 09:52:35.222936 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddsj9/must-gather-kfcc8" event={"ID":"aec8a171-5018-4091-8875-4496093fcfee","Type":"ContainerStarted","Data":"5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb"} Oct 02 09:52:35 crc kubenswrapper[4722]: I1002 09:52:35.223481 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddsj9/must-gather-kfcc8" event={"ID":"aec8a171-5018-4091-8875-4496093fcfee","Type":"ContainerStarted","Data":"9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf"} Oct 02 09:52:35 crc kubenswrapper[4722]: I1002 09:52:35.248740 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ddsj9/must-gather-kfcc8" podStartSLOduration=2.209721149 podStartE2EDuration="6.248722204s" podCreationTimestamp="2025-10-02 09:52:29 +0000 UTC" firstStartedPulling="2025-10-02 09:52:30.000135668 +0000 UTC m=+1206.036888658" lastFinishedPulling="2025-10-02 09:52:34.039136733 +0000 UTC m=+1210.075889713" observedRunningTime="2025-10-02 09:52:35.243250053 +0000 UTC m=+1211.280003033" watchObservedRunningTime="2025-10-02 09:52:35.248722204 +0000 UTC m=+1211.285475184" Oct 02 09:52:40 crc kubenswrapper[4722]: I1002 09:52:40.696585 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:52:40 crc kubenswrapper[4722]: I1002 09:52:40.697198 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:53:10 crc kubenswrapper[4722]: I1002 09:53:10.696889 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:53:10 crc kubenswrapper[4722]: I1002 09:53:10.697565 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:53:10 crc kubenswrapper[4722]: I1002 09:53:10.697614 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 
02 09:53:10 crc kubenswrapper[4722]: I1002 09:53:10.698177 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c62ee09a3389642011d46d3c51dac1ea6cb21564e2a50b3f560683d644ff19e5"} pod="openshift-machine-config-operator/machine-config-daemon-q6h76" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 09:53:10 crc kubenswrapper[4722]: I1002 09:53:10.698227 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" containerID="cri-o://c62ee09a3389642011d46d3c51dac1ea6cb21564e2a50b3f560683d644ff19e5" gracePeriod=600 Oct 02 09:53:11 crc kubenswrapper[4722]: I1002 09:53:11.464586 4722 generic.go:334] "Generic (PLEG): container finished" podID="f662826c-e772-4f56-a85f-83971aaef356" containerID="c62ee09a3389642011d46d3c51dac1ea6cb21564e2a50b3f560683d644ff19e5" exitCode=0 Oct 02 09:53:11 crc kubenswrapper[4722]: I1002 09:53:11.464627 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerDied","Data":"c62ee09a3389642011d46d3c51dac1ea6cb21564e2a50b3f560683d644ff19e5"} Oct 02 09:53:11 crc kubenswrapper[4722]: I1002 09:53:11.465569 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerStarted","Data":"1e9855b8e1fed4be34f938fd52af7ab8c5e0e45f7054a550ded49ce27d2cb1d2"} Oct 02 09:53:11 crc kubenswrapper[4722]: I1002 09:53:11.465598 4722 scope.go:117] "RemoveContainer" containerID="424c2680b223e9db678ffaf1f8b8a1f4cab14a156d3b1044d8be0e10fbad104c" Oct 02 09:53:16 crc kubenswrapper[4722]: I1002 09:53:16.293103 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hpfrs_bf859e5d-c395-4892-9abf-e1a9a9d21176/control-plane-machine-set-operator/0.log" Oct 02 09:53:16 crc kubenswrapper[4722]: I1002 09:53:16.495740 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-krj4j_86bc5bcd-672b-4bb6-8090-1ed2d77fff50/kube-rbac-proxy/0.log" Oct 02 09:53:16 crc kubenswrapper[4722]: I1002 09:53:16.504671 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-krj4j_86bc5bcd-672b-4bb6-8090-1ed2d77fff50/machine-api-operator/0.log" Oct 02 09:53:25 crc kubenswrapper[4722]: I1002 09:53:25.446589 4722 scope.go:117] "RemoveContainer" containerID="b2bb7e0973d203eba9bf29734ad5a1b68a82b7b8ffeb405278438dc16e9f6cdb" Oct 02 09:53:25 crc kubenswrapper[4722]: I1002 09:53:25.468155 4722 scope.go:117] "RemoveContainer" containerID="38cce376ec42b295682284805df97e7569723579ba5e9e2202eb6a4d6ef57aa0" Oct 02 09:53:25 crc kubenswrapper[4722]: I1002 09:53:25.492535 4722 scope.go:117] "RemoveContainer" containerID="1fd0c703c189cbd400d77c05a621dc7d6e197e2e28b1974a2199babc6acdac2c" Oct 02 09:53:30 crc kubenswrapper[4722]: I1002 09:53:30.882877 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-62lfz_0b8fc405-8054-44d8-8213-089c88a2c0a4/kube-rbac-proxy/0.log" Oct 02 09:53:30 crc kubenswrapper[4722]: I1002 09:53:30.884416 4722 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-62lfz_0b8fc405-8054-44d8-8213-089c88a2c0a4/controller/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.057813 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-frr-files/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.206135 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-metrics/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.234989 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-reloader/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.272791 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-reloader/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.274168 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-frr-files/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.489816 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-reloader/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.494248 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-frr-files/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.511301 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-metrics/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.522018 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-metrics/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.755396 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-metrics/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.758508 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-reloader/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.775484 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-frr-files/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.813381 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/controller/0.log" Oct 02 09:53:31 crc kubenswrapper[4722]: I1002 09:53:31.973417 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/frr-metrics/0.log" Oct 02 09:53:32 crc kubenswrapper[4722]: I1002 09:53:32.013383 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/kube-rbac-proxy/0.log" Oct 02 09:53:32 crc kubenswrapper[4722]: I1002 09:53:32.042059 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/kube-rbac-proxy-frr/0.log" Oct 
02 09:53:32 crc kubenswrapper[4722]: I1002 09:53:32.158378 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/reloader/0.log" Oct 02 09:53:32 crc kubenswrapper[4722]: I1002 09:53:32.295170 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-zj8n7_17b8e73d-a3a0-4f67-9634-8e7f9f6038e8/frr-k8s-webhook-server/0.log" Oct 02 09:53:32 crc kubenswrapper[4722]: I1002 09:53:32.332618 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/frr/0.log" Oct 02 09:53:32 crc kubenswrapper[4722]: I1002 09:53:32.506984 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5559d9646f-85qkg_68b65273-5373-4afe-86e2-e252c7d5b8a2/manager/0.log" Oct 02 09:53:32 crc kubenswrapper[4722]: I1002 09:53:32.572541 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-546df5c677-k4c6f_cc0ccdf8-90ba-43dc-9c71-e002f1fe038c/webhook-server/0.log" Oct 02 09:53:32 crc kubenswrapper[4722]: I1002 09:53:32.711151 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-clhg2_97a5fee1-9fe2-4292-82d0-a743eff0ecbf/kube-rbac-proxy/0.log" Oct 02 09:53:32 crc kubenswrapper[4722]: I1002 09:53:32.839932 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-clhg2_97a5fee1-9fe2-4292-82d0-a743eff0ecbf/speaker/0.log" Oct 02 09:53:54 crc kubenswrapper[4722]: I1002 09:53:54.867386 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/util/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.064849 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/pull/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.073057 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/util/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.101006 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/pull/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.300364 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/util/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.340065 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/extract/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.362982 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/pull/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.476088 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-utilities/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.648180 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-content/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.651783 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-content/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.670204 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-utilities/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.835612 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-content/0.log" Oct 02 09:53:55 crc kubenswrapper[4722]: I1002 09:53:55.841366 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-utilities/0.log" Oct 02 09:53:56 crc kubenswrapper[4722]: I1002 09:53:56.105436 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-utilities/0.log" Oct 02 09:53:56 crc kubenswrapper[4722]: I1002 09:53:56.189035 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/registry-server/0.log" Oct 02 09:53:56 crc kubenswrapper[4722]: I1002 09:53:56.235412 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-utilities/0.log" Oct 02 09:53:56 crc kubenswrapper[4722]: I1002 09:53:56.264566 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-content/0.log" Oct 02 09:53:56 crc kubenswrapper[4722]: I1002 09:53:56.432686 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-content/0.log" Oct 02 09:53:56 crc kubenswrapper[4722]: I1002 09:53:56.569609 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-content/0.log" Oct 02 09:53:56 crc kubenswrapper[4722]: I1002 09:53:56.580015 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-utilities/0.log" Oct 02 09:53:56 crc kubenswrapper[4722]: I1002 09:53:56.818513 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nkmjs_bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8/marketplace-operator/0.log" Oct 02 09:53:56 crc kubenswrapper[4722]: I1002 09:53:56.843834 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-utilities/0.log" Oct 02 09:53:56 crc kubenswrapper[4722]: I1002 09:53:56.918206 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/registry-server/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.024472 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-utilities/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.026625 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-content/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.054069 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-content/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.205831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-content/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.214010 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-utilities/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.300048 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/registry-server/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.414456 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-utilities/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.593155 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-utilities/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.607843 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-content/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.616175 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-content/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.817242 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-utilities/0.log" Oct 02 09:53:57 crc kubenswrapper[4722]: I1002 09:53:57.823159 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-content/0.log" Oct 02 09:53:58 crc kubenswrapper[4722]: I1002 09:53:58.039365 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/registry-server/0.log" Oct 02 09:54:25 crc kubenswrapper[4722]: I1002 09:54:25.534238 4722 scope.go:117] "RemoveContainer" containerID="3563eca56b33edd989039c37a68c6da54c9c98a40cd0994f4339847b4426b7be" Oct 02 09:54:25 crc kubenswrapper[4722]: I1002 09:54:25.557357 4722 scope.go:117] "RemoveContainer" containerID="d3b357163a5285eb305a6bc4e9bdd9c06651e5234fe897816b25d2186225f903" Oct 02 09:54:25 crc 
Oct 02 09:54:25 crc kubenswrapper[4722]: I1002 09:54:25.580973 4722 scope.go:117] "RemoveContainer" containerID="b14120a39461b996163f77b4b4ed0c74bf772aa3d673d899cbb7032d984b076c"
Oct 02 09:55:04 crc kubenswrapper[4722]: I1002 09:55:04.130928 4722 generic.go:334] "Generic (PLEG): container finished" podID="aec8a171-5018-4091-8875-4496093fcfee" containerID="5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb" exitCode=0
Oct 02 09:55:04 crc kubenswrapper[4722]: I1002 09:55:04.131530 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddsj9/must-gather-kfcc8" event={"ID":"aec8a171-5018-4091-8875-4496093fcfee","Type":"ContainerDied","Data":"5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb"}
Oct 02 09:55:04 crc kubenswrapper[4722]: I1002 09:55:04.132065 4722 scope.go:117] "RemoveContainer" containerID="5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb"
Oct 02 09:55:04 crc kubenswrapper[4722]: I1002 09:55:04.222505 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ddsj9_must-gather-kfcc8_aec8a171-5018-4091-8875-4496093fcfee/gather/0.log"
Oct 02 09:55:10 crc kubenswrapper[4722]: I1002 09:55:10.561503 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ddsj9/must-gather-kfcc8"]
Oct 02 09:55:10 crc kubenswrapper[4722]: I1002 09:55:10.563445 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ddsj9/must-gather-kfcc8" podUID="aec8a171-5018-4091-8875-4496093fcfee" containerName="copy" containerID="cri-o://9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf" gracePeriod=2
Oct 02 09:55:10 crc kubenswrapper[4722]: I1002 09:55:10.566895 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ddsj9/must-gather-kfcc8"]
Oct 02 09:55:10 crc kubenswrapper[4722]: I1002 09:55:10.939825 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ddsj9_must-gather-kfcc8_aec8a171-5018-4091-8875-4496093fcfee/copy/0.log"
Oct 02 09:55:10 crc kubenswrapper[4722]: I1002 09:55:10.941088 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ddsj9/must-gather-kfcc8"
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.000095 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aec8a171-5018-4091-8875-4496093fcfee-must-gather-output\") pod \"aec8a171-5018-4091-8875-4496093fcfee\" (UID: \"aec8a171-5018-4091-8875-4496093fcfee\") "
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.000200 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjhpv\" (UniqueName: \"kubernetes.io/projected/aec8a171-5018-4091-8875-4496093fcfee-kube-api-access-vjhpv\") pod \"aec8a171-5018-4091-8875-4496093fcfee\" (UID: \"aec8a171-5018-4091-8875-4496093fcfee\") "
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.006522 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec8a171-5018-4091-8875-4496093fcfee-kube-api-access-vjhpv" (OuterVolumeSpecName: "kube-api-access-vjhpv") pod "aec8a171-5018-4091-8875-4496093fcfee" (UID: "aec8a171-5018-4091-8875-4496093fcfee"). InnerVolumeSpecName "kube-api-access-vjhpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.060843 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aec8a171-5018-4091-8875-4496093fcfee-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "aec8a171-5018-4091-8875-4496093fcfee" (UID: "aec8a171-5018-4091-8875-4496093fcfee"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.102298 4722 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aec8a171-5018-4091-8875-4496093fcfee-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.102356 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjhpv\" (UniqueName: \"kubernetes.io/projected/aec8a171-5018-4091-8875-4496093fcfee-kube-api-access-vjhpv\") on node \"crc\" DevicePath \"\""
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.209863 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ddsj9_must-gather-kfcc8_aec8a171-5018-4091-8875-4496093fcfee/copy/0.log"
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.210357 4722 generic.go:334] "Generic (PLEG): container finished" podID="aec8a171-5018-4091-8875-4496093fcfee" containerID="9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf" exitCode=143
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.210379 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ddsj9/must-gather-kfcc8"
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.210441 4722 scope.go:117] "RemoveContainer" containerID="9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf"
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.231058 4722 scope.go:117] "RemoveContainer" containerID="5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb"
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.278766 4722 scope.go:117] "RemoveContainer" containerID="9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf"
Oct 02 09:55:11 crc kubenswrapper[4722]: E1002 09:55:11.279494 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf\": container with ID starting with 9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf not found: ID does not exist" containerID="9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf"
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.279529 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf"} err="failed to get container status \"9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf\": rpc error: code = NotFound desc = could not find container \"9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf\": container with ID starting with 9382a37dc67b003dee38a7eb272d01333a1c9a6a2e0e094d1fd5144e30b032cf not found: ID does not exist"
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.279583 4722 scope.go:117] "RemoveContainer" containerID="5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb"
Oct 02 09:55:11 crc kubenswrapper[4722]: E1002 09:55:11.280157 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb\": container with ID starting with 5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb not found: ID does not exist" containerID="5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb"
Oct 02 09:55:11 crc kubenswrapper[4722]: I1002 09:55:11.280191 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb"} err="failed to get container status \"5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb\": rpc error: code = NotFound desc = could not find container \"5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb\": container with ID starting with 5462d902b828085eb4de02458ce5e40494bb752a8ae1a31ff9e580f368575cbb not found: ID does not exist"
Oct 02 09:55:12 crc kubenswrapper[4722]: I1002 09:55:12.717805 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec8a171-5018-4091-8875-4496093fcfee" path="/var/lib/kubelet/pods/aec8a171-5018-4091-8875-4496093fcfee/volumes"
Oct 02 09:55:25 crc kubenswrapper[4722]: I1002 09:55:25.623175 4722 scope.go:117] "RemoveContainer" containerID="3f69325ff4c10ac54e84948534aabd30afbdd129881ae6199aa3b3e30e46b556"
Oct 02 09:55:25 crc kubenswrapper[4722]: I1002 09:55:25.647536 4722 scope.go:117] "RemoveContainer" containerID="5f6438a7ccae3a1a60c661c9cfe870ab197b6051c1fe51f6ceba12f144989dd9"
Oct 02 09:55:25 crc kubenswrapper[4722]: I1002 09:55:25.665061 4722 scope.go:117] "RemoveContainer" containerID="9dd14ccf921c3cac01dace2efe4378440157b80408f691d4dbff833eeac166e0"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.597160 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ggvgk/must-gather-bwg5z"]
Oct 02 09:55:33 crc kubenswrapper[4722]: E1002 09:55:33.598576 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec8a171-5018-4091-8875-4496093fcfee" containerName="gather"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.598601 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec8a171-5018-4091-8875-4496093fcfee" containerName="gather"
Oct 02 09:55:33 crc kubenswrapper[4722]: E1002 09:55:33.598671 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec8a171-5018-4091-8875-4496093fcfee" containerName="copy"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.598681 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec8a171-5018-4091-8875-4496093fcfee" containerName="copy"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.598842 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec8a171-5018-4091-8875-4496093fcfee" containerName="gather"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.598867 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec8a171-5018-4091-8875-4496093fcfee" containerName="copy"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.599687 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggvgk/must-gather-bwg5z"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.603038 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ggvgk"/"openshift-service-ca.crt"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.603270 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ggvgk"/"kube-root-ca.crt"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.616424 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ggvgk/must-gather-bwg5z"]
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.703017 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-must-gather-output\") pod \"must-gather-bwg5z\" (UID: \"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3\") " pod="openshift-must-gather-ggvgk/must-gather-bwg5z"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.703076 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mq7f\" (UniqueName: \"kubernetes.io/projected/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-kube-api-access-9mq7f\") pod \"must-gather-bwg5z\" (UID: \"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3\") " pod="openshift-must-gather-ggvgk/must-gather-bwg5z"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.805091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-must-gather-output\") pod \"must-gather-bwg5z\" (UID: \"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3\") " pod="openshift-must-gather-ggvgk/must-gather-bwg5z"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.805155 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mq7f\" (UniqueName: \"kubernetes.io/projected/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-kube-api-access-9mq7f\") pod \"must-gather-bwg5z\" (UID: \"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3\") " pod="openshift-must-gather-ggvgk/must-gather-bwg5z"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.805622 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-must-gather-output\") pod \"must-gather-bwg5z\" (UID: \"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3\") " pod="openshift-must-gather-ggvgk/must-gather-bwg5z"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.831287 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mq7f\" (UniqueName: \"kubernetes.io/projected/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-kube-api-access-9mq7f\") pod \"must-gather-bwg5z\" (UID: \"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3\") " pod="openshift-must-gather-ggvgk/must-gather-bwg5z"
Oct 02 09:55:33 crc kubenswrapper[4722]: I1002 09:55:33.923521 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggvgk/must-gather-bwg5z"
Oct 02 09:55:34 crc kubenswrapper[4722]: I1002 09:55:34.343167 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ggvgk/must-gather-bwg5z"]
Oct 02 09:55:34 crc kubenswrapper[4722]: I1002 09:55:34.372987 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggvgk/must-gather-bwg5z" event={"ID":"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3","Type":"ContainerStarted","Data":"1cff7bb6d4c30683d9aa9a194d1ecf9520d0cb132a52bd20820cd9c9bb438485"}
Oct 02 09:55:35 crc kubenswrapper[4722]: I1002 09:55:35.380717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggvgk/must-gather-bwg5z" event={"ID":"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3","Type":"ContainerStarted","Data":"e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b"}
Oct 02 09:55:35 crc kubenswrapper[4722]: I1002 09:55:35.381641 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggvgk/must-gather-bwg5z" event={"ID":"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3","Type":"ContainerStarted","Data":"89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e"}
Oct 02 09:55:35 crc kubenswrapper[4722]: I1002 09:55:35.398357 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ggvgk/must-gather-bwg5z" podStartSLOduration=2.398336873 podStartE2EDuration="2.398336873s" podCreationTimestamp="2025-10-02 09:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 09:55:35.394880957 +0000 UTC m=+1391.431633957" watchObservedRunningTime="2025-10-02 09:55:35.398336873 +0000 UTC m=+1391.435089873"
Oct 02 09:55:40 crc kubenswrapper[4722]: I1002 09:55:40.696684 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 09:55:40 crc kubenswrapper[4722]: I1002 09:55:40.697015 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 09:56:10 crc kubenswrapper[4722]: I1002 09:56:10.697475 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 09:56:10 crc kubenswrapper[4722]: I1002 09:56:10.698088 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hpfrs_bf859e5d-c395-4892-9abf-e1a9a9d21176/control-plane-machine-set-operator/0.log" Oct 02 09:56:12 crc kubenswrapper[4722]: I1002 09:56:12.010426 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-krj4j_86bc5bcd-672b-4bb6-8090-1ed2d77fff50/kube-rbac-proxy/0.log" Oct 02 09:56:12 crc kubenswrapper[4722]: I1002 09:56:12.053580 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-krj4j_86bc5bcd-672b-4bb6-8090-1ed2d77fff50/machine-api-operator/0.log" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.328593 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dhrqg"] Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.330366 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.356865 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dhrqg"] Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.441028 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-utilities\") pod \"redhat-operators-dhrqg\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") " pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.441133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkvr9\" (UniqueName: \"kubernetes.io/projected/918048fd-59ef-4782-a241-ca6ebc9e9d6b-kube-api-access-kkvr9\") pod \"redhat-operators-dhrqg\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") " pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.441212 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-catalog-content\") pod \"redhat-operators-dhrqg\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") " pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.542348 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-utilities\") pod \"redhat-operators-dhrqg\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") " pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.542419 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkvr9\" (UniqueName: \"kubernetes.io/projected/918048fd-59ef-4782-a241-ca6ebc9e9d6b-kube-api-access-kkvr9\") pod \"redhat-operators-dhrqg\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") " pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.542495 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-catalog-content\") pod \"redhat-operators-dhrqg\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") " 
pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.542971 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-utilities\") pod \"redhat-operators-dhrqg\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") " pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.543051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-catalog-content\") pod \"redhat-operators-dhrqg\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") " pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.563470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkvr9\" (UniqueName: \"kubernetes.io/projected/918048fd-59ef-4782-a241-ca6ebc9e9d6b-kube-api-access-kkvr9\") pod \"redhat-operators-dhrqg\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") " pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.649422 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:15 crc kubenswrapper[4722]: I1002 09:56:15.899982 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dhrqg"] Oct 02 09:56:16 crc kubenswrapper[4722]: I1002 09:56:16.626768 4722 generic.go:334] "Generic (PLEG): container finished" podID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerID="ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6" exitCode=0 Oct 02 09:56:16 crc kubenswrapper[4722]: I1002 09:56:16.626833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhrqg" event={"ID":"918048fd-59ef-4782-a241-ca6ebc9e9d6b","Type":"ContainerDied","Data":"ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6"} Oct 02 09:56:16 crc kubenswrapper[4722]: I1002 09:56:16.627099 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhrqg" event={"ID":"918048fd-59ef-4782-a241-ca6ebc9e9d6b","Type":"ContainerStarted","Data":"1225b1ec64d534e4990c4ce44fc424f8f596915aed5896a1a6fe964740cdf289"} Oct 02 09:56:16 crc kubenswrapper[4722]: I1002 09:56:16.628521 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 09:56:17 crc kubenswrapper[4722]: I1002 09:56:17.634214 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhrqg" event={"ID":"918048fd-59ef-4782-a241-ca6ebc9e9d6b","Type":"ContainerStarted","Data":"947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f"} Oct 02 09:56:18 crc kubenswrapper[4722]: I1002 09:56:18.651308 4722 generic.go:334] "Generic (PLEG): container finished" podID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerID="947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f" exitCode=0 Oct 02 09:56:18 crc kubenswrapper[4722]: I1002 09:56:18.651374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhrqg" event={"ID":"918048fd-59ef-4782-a241-ca6ebc9e9d6b","Type":"ContainerDied","Data":"947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f"} Oct 02 09:56:18 crc kubenswrapper[4722]: 
I1002 09:56:18.896591 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z5cdd"] Oct 02 09:56:18 crc kubenswrapper[4722]: I1002 09:56:18.897754 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:18 crc kubenswrapper[4722]: I1002 09:56:18.917837 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z5cdd"] Oct 02 09:56:18 crc kubenswrapper[4722]: I1002 09:56:18.991323 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-catalog-content\") pod \"certified-operators-z5cdd\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:18 crc kubenswrapper[4722]: I1002 09:56:18.991661 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s85vd\" (UniqueName: \"kubernetes.io/projected/395a540f-a004-4d25-b209-c65b1e1373e9-kube-api-access-s85vd\") pod \"certified-operators-z5cdd\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:18 crc kubenswrapper[4722]: I1002 09:56:18.991699 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-utilities\") pod \"certified-operators-z5cdd\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.092491 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-catalog-content\") pod \"certified-operators-z5cdd\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.092569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s85vd\" (UniqueName: \"kubernetes.io/projected/395a540f-a004-4d25-b209-c65b1e1373e9-kube-api-access-s85vd\") pod \"certified-operators-z5cdd\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.092610 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-utilities\") pod \"certified-operators-z5cdd\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.093061 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-catalog-content\") pod \"certified-operators-z5cdd\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.093159 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-utilities\") pod \"certified-operators-z5cdd\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.119179 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s85vd\" (UniqueName: \"kubernetes.io/projected/395a540f-a004-4d25-b209-c65b1e1373e9-kube-api-access-s85vd\") pod \"certified-operators-z5cdd\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.217963 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.630680 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z5cdd"] Oct 02 09:56:19 crc kubenswrapper[4722]: W1002 09:56:19.636046 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod395a540f_a004_4d25_b209_c65b1e1373e9.slice/crio-d14305f249c95d58f888ba82ae4c55a35fbed523dc54df7f2ac555fe59637275 WatchSource:0}: Error finding container d14305f249c95d58f888ba82ae4c55a35fbed523dc54df7f2ac555fe59637275: Status 404 returned error can't find the container with id d14305f249c95d58f888ba82ae4c55a35fbed523dc54df7f2ac555fe59637275 Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.658393 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5cdd" event={"ID":"395a540f-a004-4d25-b209-c65b1e1373e9","Type":"ContainerStarted","Data":"d14305f249c95d58f888ba82ae4c55a35fbed523dc54df7f2ac555fe59637275"} Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.660644 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhrqg" event={"ID":"918048fd-59ef-4782-a241-ca6ebc9e9d6b","Type":"ContainerStarted","Data":"8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081"} Oct 02 09:56:19 crc kubenswrapper[4722]: I1002 09:56:19.683076 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dhrqg" podStartSLOduration=2.22962735 podStartE2EDuration="4.683059011s" podCreationTimestamp="2025-10-02 09:56:15 +0000 UTC" firstStartedPulling="2025-10-02 09:56:16.628259921 +0000 UTC m=+1432.665012901" lastFinishedPulling="2025-10-02 09:56:19.081691582 +0000 UTC m=+1435.118444562" observedRunningTime="2025-10-02 09:56:19.681758448 +0000 UTC m=+1435.718511448" watchObservedRunningTime="2025-10-02 09:56:19.683059011 +0000 UTC m=+1435.719812001" Oct 02 09:56:20 crc kubenswrapper[4722]: I1002 09:56:20.667329 4722 generic.go:334] "Generic (PLEG): container finished" podID="395a540f-a004-4d25-b209-c65b1e1373e9" containerID="107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2" exitCode=0 Oct 02 09:56:20 crc kubenswrapper[4722]: I1002 09:56:20.667411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5cdd" event={"ID":"395a540f-a004-4d25-b209-c65b1e1373e9","Type":"ContainerDied","Data":"107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2"} Oct 02 09:56:21 crc kubenswrapper[4722]: I1002 09:56:21.675265 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5cdd" 
event={"ID":"395a540f-a004-4d25-b209-c65b1e1373e9","Type":"ContainerStarted","Data":"7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311"} Oct 02 09:56:22 crc kubenswrapper[4722]: I1002 09:56:22.682985 4722 generic.go:334] "Generic (PLEG): container finished" podID="395a540f-a004-4d25-b209-c65b1e1373e9" containerID="7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311" exitCode=0 Oct 02 09:56:22 crc kubenswrapper[4722]: I1002 09:56:22.683061 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5cdd" event={"ID":"395a540f-a004-4d25-b209-c65b1e1373e9","Type":"ContainerDied","Data":"7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311"} Oct 02 09:56:23 crc kubenswrapper[4722]: I1002 09:56:23.691827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5cdd" event={"ID":"395a540f-a004-4d25-b209-c65b1e1373e9","Type":"ContainerStarted","Data":"c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a"} Oct 02 09:56:23 crc kubenswrapper[4722]: I1002 09:56:23.709783 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z5cdd" podStartSLOduration=3.03412457 podStartE2EDuration="5.709758773s" podCreationTimestamp="2025-10-02 09:56:18 +0000 UTC" firstStartedPulling="2025-10-02 09:56:20.669455003 +0000 UTC m=+1436.706207983" lastFinishedPulling="2025-10-02 09:56:23.345089206 +0000 UTC m=+1439.381842186" observedRunningTime="2025-10-02 09:56:23.707147438 +0000 UTC m=+1439.743900438" watchObservedRunningTime="2025-10-02 09:56:23.709758773 +0000 UTC m=+1439.746511753" Oct 02 09:56:25 crc kubenswrapper[4722]: I1002 09:56:25.650548 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:25 crc kubenswrapper[4722]: I1002 09:56:25.651256 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:25 crc kubenswrapper[4722]: I1002 09:56:25.692421 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:25 crc kubenswrapper[4722]: I1002 09:56:25.716955 4722 scope.go:117] "RemoveContainer" containerID="8e3ba57210dc8d1c17bd479fd8090305ab6c61257a5e795542328de4a413f4f8" Oct 02 09:56:25 crc kubenswrapper[4722]: I1002 09:56:25.738261 4722 scope.go:117] "RemoveContainer" containerID="72e6cb9e7b3a51db3544adb68d8fca6905cc97b68c242ef8da4a19cd9f479b97" Oct 02 09:56:25 crc kubenswrapper[4722]: I1002 09:56:25.745996 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dhrqg" Oct 02 09:56:25 crc kubenswrapper[4722]: I1002 09:56:25.758905 4722 scope.go:117] "RemoveContainer" containerID="ab900f218fdfd68429e8fcf3ec42ad27d8504528ce371c6d45b3146b1cf402f2" Oct 02 09:56:25 crc kubenswrapper[4722]: I1002 09:56:25.782102 4722 scope.go:117] "RemoveContainer" containerID="4642f406934c0358bbd514277de06a9df9a755c6bd1a976ec91ea2a636cbf1b7" Oct 02 09:56:25 crc kubenswrapper[4722]: I1002 09:56:25.805967 4722 scope.go:117] "RemoveContainer" containerID="ed3d756892bd016811916bf95d762940d1155429d135e9473d4c5dce7772a53d" Oct 02 09:56:26 crc kubenswrapper[4722]: I1002 09:56:26.890811 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dhrqg"] Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.203669 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-62lfz_0b8fc405-8054-44d8-8213-089c88a2c0a4/kube-rbac-proxy/0.log"
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.263716 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-62lfz_0b8fc405-8054-44d8-8213-089c88a2c0a4/controller/0.log"
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.428211 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-frr-files/0.log"
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.578462 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-metrics/0.log"
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.600920 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-frr-files/0.log"
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.617740 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-reloader/0.log"
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.624101 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-reloader/0.log"
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.715642 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dhrqg" podUID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerName="registry-server" containerID="cri-o://8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081" gracePeriod=2
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.842504 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-frr-files/0.log"
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.856967 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-reloader/0.log"
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.860940 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-metrics/0.log"
Oct 02 09:56:27 crc kubenswrapper[4722]: I1002 09:56:27.885264 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-metrics/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.098057 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-frr-files/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.103599 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dhrqg"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.110920 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/controller/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.132105 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-reloader/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.154677 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/cp-metrics/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.221889 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkvr9\" (UniqueName: \"kubernetes.io/projected/918048fd-59ef-4782-a241-ca6ebc9e9d6b-kube-api-access-kkvr9\") pod \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") "
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.222302 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-catalog-content\") pod \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") "
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.222491 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-utilities\") pod \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\" (UID: \"918048fd-59ef-4782-a241-ca6ebc9e9d6b\") "
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.223243 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-utilities" (OuterVolumeSpecName: "utilities") pod "918048fd-59ef-4782-a241-ca6ebc9e9d6b" (UID: "918048fd-59ef-4782-a241-ca6ebc9e9d6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.230295 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/918048fd-59ef-4782-a241-ca6ebc9e9d6b-kube-api-access-kkvr9" (OuterVolumeSpecName: "kube-api-access-kkvr9") pod "918048fd-59ef-4782-a241-ca6ebc9e9d6b" (UID: "918048fd-59ef-4782-a241-ca6ebc9e9d6b"). InnerVolumeSpecName "kube-api-access-kkvr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.323571 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkvr9\" (UniqueName: \"kubernetes.io/projected/918048fd-59ef-4782-a241-ca6ebc9e9d6b-kube-api-access-kkvr9\") on node \"crc\" DevicePath \"\""
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.323602 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.330855 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/frr-metrics/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.351823 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/kube-rbac-proxy/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.363373 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/kube-rbac-proxy-frr/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.524494 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/reloader/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.567017 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-zj8n7_17b8e73d-a3a0-4f67-9634-8e7f9f6038e8/frr-k8s-webhook-server/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.732064 4722 generic.go:334] "Generic (PLEG): container finished" podID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerID="8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081" exitCode=0
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.732283 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dhrqg"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.732365 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhrqg" event={"ID":"918048fd-59ef-4782-a241-ca6ebc9e9d6b","Type":"ContainerDied","Data":"8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081"}
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.733587 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhrqg" event={"ID":"918048fd-59ef-4782-a241-ca6ebc9e9d6b","Type":"ContainerDied","Data":"1225b1ec64d534e4990c4ce44fc424f8f596915aed5896a1a6fe964740cdf289"}
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.733634 4722 scope.go:117] "RemoveContainer" containerID="8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.763040 4722 scope.go:117] "RemoveContainer" containerID="947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.778022 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pl8wr_c20d6613-a91d-4afc-96a1-efbdffba4415/frr/0.log"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.793147 4722 scope.go:117] "RemoveContainer" containerID="ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.821656 4722 scope.go:117] "RemoveContainer" containerID="8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081"
Oct 02 09:56:28 crc kubenswrapper[4722]: E1002 09:56:28.822497 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081\": container with ID starting with 8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081 not found: ID does not exist" containerID="8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.822538 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081"} err="failed to get container status \"8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081\": rpc error: code = NotFound desc = could not find container \"8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081\": container with ID starting with 8f78f4a84b516de5274d0df1f56b2d2c8f13a5b8017122720590a62bb2505081 not found: ID does not exist"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.822569 4722 scope.go:117] "RemoveContainer" containerID="947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f"
Oct 02 09:56:28 crc kubenswrapper[4722]: E1002 09:56:28.823144 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f\": container with ID starting with 947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f not found: ID does not exist" containerID="947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.823168 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f"} err="failed to get container status \"947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f\": rpc error: code = NotFound desc = could not find container \"947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f\": container with ID starting with 947aa5ebb91bda72a0a44a92933c7636c4d761e5be74dea0e8e23702a408856f not found: ID does not exist"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.823185 4722 scope.go:117] "RemoveContainer" containerID="ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6"
Oct 02 09:56:28 crc kubenswrapper[4722]: E1002 09:56:28.823829 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6\": container with ID starting with ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6 not found: ID does not exist" containerID="ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.823912 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6"} err="failed to get container status \"ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6\": rpc error: code = NotFound desc = could not find container \"ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6\": container with ID starting with ae176bd27e9954c0eb09e11ca8b413e19e6ade0abf5ff64987757eb7aa5d31a6 not found: ID does not exist"
Oct 02 09:56:28 crc kubenswrapper[4722]: I1002 09:56:28.859456 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5559d9646f-85qkg_68b65273-5373-4afe-86e2-e252c7d5b8a2/manager/0.log"
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.021342 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-clhg2_97a5fee1-9fe2-4292-82d0-a743eff0ecbf/kube-rbac-proxy/0.log"
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.051277 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-546df5c677-k4c6f_cc0ccdf8-90ba-43dc-9c71-e002f1fe038c/webhook-server/0.log"
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.220665 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z5cdd"
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.220721 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z5cdd"
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.258058 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-clhg2_97a5fee1-9fe2-4292-82d0-a743eff0ecbf/speaker/0.log"
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.269382 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z5cdd"
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.409818 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "918048fd-59ef-4782-a241-ca6ebc9e9d6b" (UID: "918048fd-59ef-4782-a241-ca6ebc9e9d6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.440625 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/918048fd-59ef-4782-a241-ca6ebc9e9d6b-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.661822 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dhrqg"]
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.668587 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dhrqg"]
Oct 02 09:56:29 crc kubenswrapper[4722]: I1002 09:56:29.785211 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z5cdd"
Oct 02 09:56:30 crc kubenswrapper[4722]: I1002 09:56:30.719425 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" path="/var/lib/kubelet/pods/918048fd-59ef-4782-a241-ca6ebc9e9d6b/volumes"
Oct 02 09:56:30 crc kubenswrapper[4722]: I1002 09:56:30.892846 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z5cdd"]
Oct 02 09:56:31 crc kubenswrapper[4722]: I1002 09:56:31.753476 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z5cdd" podUID="395a540f-a004-4d25-b209-c65b1e1373e9" containerName="registry-server" containerID="cri-o://c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a" gracePeriod=2
Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.622800 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z5cdd"
Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.763217 4722 generic.go:334] "Generic (PLEG): container finished" podID="395a540f-a004-4d25-b209-c65b1e1373e9" containerID="c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a" exitCode=0
Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.763287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5cdd" event={"ID":"395a540f-a004-4d25-b209-c65b1e1373e9","Type":"ContainerDied","Data":"c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a"}
Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.763358 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5cdd" event={"ID":"395a540f-a004-4d25-b209-c65b1e1373e9","Type":"ContainerDied","Data":"d14305f249c95d58f888ba82ae4c55a35fbed523dc54df7f2ac555fe59637275"}
Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.763378 4722 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-z5cdd" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.763385 4722 scope.go:117] "RemoveContainer" containerID="c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.782951 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s85vd\" (UniqueName: \"kubernetes.io/projected/395a540f-a004-4d25-b209-c65b1e1373e9-kube-api-access-s85vd\") pod \"395a540f-a004-4d25-b209-c65b1e1373e9\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.783375 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-utilities\") pod \"395a540f-a004-4d25-b209-c65b1e1373e9\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.783556 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-catalog-content\") pod \"395a540f-a004-4d25-b209-c65b1e1373e9\" (UID: \"395a540f-a004-4d25-b209-c65b1e1373e9\") " Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.784304 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-utilities" (OuterVolumeSpecName: "utilities") pod "395a540f-a004-4d25-b209-c65b1e1373e9" (UID: "395a540f-a004-4d25-b209-c65b1e1373e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.787324 4722 scope.go:117] "RemoveContainer" containerID="7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.794576 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395a540f-a004-4d25-b209-c65b1e1373e9-kube-api-access-s85vd" (OuterVolumeSpecName: "kube-api-access-s85vd") pod "395a540f-a004-4d25-b209-c65b1e1373e9" (UID: "395a540f-a004-4d25-b209-c65b1e1373e9"). InnerVolumeSpecName "kube-api-access-s85vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.824431 4722 scope.go:117] "RemoveContainer" containerID="107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.832626 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "395a540f-a004-4d25-b209-c65b1e1373e9" (UID: "395a540f-a004-4d25-b209-c65b1e1373e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.841201 4722 scope.go:117] "RemoveContainer" containerID="c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a" Oct 02 09:56:32 crc kubenswrapper[4722]: E1002 09:56:32.841870 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a\": container with ID starting with c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a not found: ID does not exist" containerID="c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.841907 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a"} err="failed to get container status \"c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a\": rpc error: code = NotFound desc = could not find container \"c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a\": container with ID starting with c2eed5b2627e2078370c3d28247b71ac19c40265b10499248e7a0081fcc74c2a not found: ID does not exist" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.841930 4722 scope.go:117] "RemoveContainer" containerID="7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311" Oct 02 09:56:32 crc kubenswrapper[4722]: E1002 09:56:32.842209 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311\": container with ID starting with 7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311 not found: ID does not exist" containerID="7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.842232 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311"} err="failed to get container status \"7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311\": rpc error: code = NotFound desc = could not find container \"7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311\": container with ID starting with 7c9cf60508ac94ab183e5b5539b4e238e5fb926f862d1ed7394d18678e49d311 not found: ID does not exist" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.842245 4722 scope.go:117] "RemoveContainer" containerID="107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2" Oct 02 09:56:32 crc kubenswrapper[4722]: E1002 09:56:32.842715 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2\": container with ID starting with 107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2 not found: ID does not exist" containerID="107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.842736 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2"} err="failed to get container status \"107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2\": rpc error: code = NotFound desc = could not 
find container \"107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2\": container with ID starting with 107a7cc16fdf50735ad7aa43003a7a26ed5d968a0ad547c1bd9bc5503fb82ff2 not found: ID does not exist" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.884796 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s85vd\" (UniqueName: \"kubernetes.io/projected/395a540f-a004-4d25-b209-c65b1e1373e9-kube-api-access-s85vd\") on node \"crc\" DevicePath \"\"" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.884916 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:56:32 crc kubenswrapper[4722]: I1002 09:56:32.884970 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395a540f-a004-4d25-b209-c65b1e1373e9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:56:33 crc kubenswrapper[4722]: I1002 09:56:33.107219 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z5cdd"] Oct 02 09:56:33 crc kubenswrapper[4722]: I1002 09:56:33.111938 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z5cdd"] Oct 02 09:56:34 crc kubenswrapper[4722]: I1002 09:56:34.718009 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395a540f-a004-4d25-b209-c65b1e1373e9" path="/var/lib/kubelet/pods/395a540f-a004-4d25-b209-c65b1e1373e9/volumes" Oct 02 09:56:40 crc kubenswrapper[4722]: I1002 09:56:40.697180 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:56:40 crc kubenswrapper[4722]: I1002 09:56:40.697789 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:56:40 crc kubenswrapper[4722]: I1002 09:56:40.697847 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:56:40 crc kubenswrapper[4722]: I1002 09:56:40.698507 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e9855b8e1fed4be34f938fd52af7ab8c5e0e45f7054a550ded49ce27d2cb1d2"} pod="openshift-machine-config-operator/machine-config-daemon-q6h76" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 09:56:40 crc kubenswrapper[4722]: I1002 09:56:40.698572 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" containerID="cri-o://1e9855b8e1fed4be34f938fd52af7ab8c5e0e45f7054a550ded49ce27d2cb1d2" gracePeriod=600 Oct 02 09:56:41 crc kubenswrapper[4722]: I1002 09:56:41.817434 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="f662826c-e772-4f56-a85f-83971aaef356" containerID="1e9855b8e1fed4be34f938fd52af7ab8c5e0e45f7054a550ded49ce27d2cb1d2" exitCode=0 Oct 02 09:56:41 crc kubenswrapper[4722]: I1002 09:56:41.817506 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerDied","Data":"1e9855b8e1fed4be34f938fd52af7ab8c5e0e45f7054a550ded49ce27d2cb1d2"} Oct 02 09:56:41 crc kubenswrapper[4722]: I1002 09:56:41.817782 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerStarted","Data":"ceae4c47df969dfe7cb589f42f242be03f2c0101a539c919139be87800d32c48"} Oct 02 09:56:41 crc kubenswrapper[4722]: I1002 09:56:41.817805 4722 scope.go:117] "RemoveContainer" containerID="c62ee09a3389642011d46d3c51dac1ea6cb21564e2a50b3f560683d644ff19e5" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.072491 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jgx7g"] Oct 02 09:56:50 crc kubenswrapper[4722]: E1002 09:56:50.073212 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395a540f-a004-4d25-b209-c65b1e1373e9" containerName="extract-utilities" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.073224 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="395a540f-a004-4d25-b209-c65b1e1373e9" containerName="extract-utilities" Oct 02 09:56:50 crc kubenswrapper[4722]: E1002 09:56:50.073237 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerName="extract-utilities" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.073244 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerName="extract-utilities" Oct 02 09:56:50 crc kubenswrapper[4722]: E1002 09:56:50.073258 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395a540f-a004-4d25-b209-c65b1e1373e9" containerName="registry-server" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.073266 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="395a540f-a004-4d25-b209-c65b1e1373e9" containerName="registry-server" Oct 02 09:56:50 crc kubenswrapper[4722]: E1002 09:56:50.073287 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerName="extract-content" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.073299 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerName="extract-content" Oct 02 09:56:50 crc kubenswrapper[4722]: E1002 09:56:50.073331 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerName="registry-server" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.073341 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerName="registry-server" Oct 02 09:56:50 crc kubenswrapper[4722]: E1002 09:56:50.073353 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395a540f-a004-4d25-b209-c65b1e1373e9" containerName="extract-content" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.073359 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="395a540f-a004-4d25-b209-c65b1e1373e9" containerName="extract-content" Oct 02 
09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.073475 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="395a540f-a004-4d25-b209-c65b1e1373e9" containerName="registry-server" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.073492 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="918048fd-59ef-4782-a241-ca6ebc9e9d6b" containerName="registry-server" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.074305 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.095729 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgx7g"] Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.226827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-catalog-content\") pod \"redhat-marketplace-jgx7g\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.226900 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-utilities\") pod \"redhat-marketplace-jgx7g\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.226941 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mdd\" (UniqueName: \"kubernetes.io/projected/7749f1af-e8be-4ea4-8ada-9b6367176b90-kube-api-access-88mdd\") pod \"redhat-marketplace-jgx7g\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.230542 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/util/0.log" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.328257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-catalog-content\") pod \"redhat-marketplace-jgx7g\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.328349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-utilities\") pod \"redhat-marketplace-jgx7g\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.328392 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88mdd\" (UniqueName: \"kubernetes.io/projected/7749f1af-e8be-4ea4-8ada-9b6367176b90-kube-api-access-88mdd\") pod \"redhat-marketplace-jgx7g\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.328925 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-catalog-content\") pod \"redhat-marketplace-jgx7g\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.329043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-utilities\") pod \"redhat-marketplace-jgx7g\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.353207 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mdd\" (UniqueName: \"kubernetes.io/projected/7749f1af-e8be-4ea4-8ada-9b6367176b90-kube-api-access-88mdd\") pod \"redhat-marketplace-jgx7g\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.392662 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.477253 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/pull/0.log" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.512456 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/util/0.log" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.525220 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/pull/0.log" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.764757 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/util/0.log" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.787955 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/extract/0.log" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.804082 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qhg8v_98e4ffb9-0cc0-4196-8f18-08f3a5827133/pull/0.log" Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.852953 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgx7g"] Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.870454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgx7g" event={"ID":"7749f1af-e8be-4ea4-8ada-9b6367176b90","Type":"ContainerStarted","Data":"3960e8b24fc32de2b820351c2a46d1ff5621d5ae198f7ed7fe3d25b07dc8e305"} Oct 02 09:56:50 crc kubenswrapper[4722]: I1002 09:56:50.976655 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-utilities/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.134642 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-content/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.137187 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-content/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.174468 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-utilities/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.332626 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-content/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.357429 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/extract-utilities/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.524735 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-utilities/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.726956 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rfdss_abd654b8-cf56-4df7-840f-50f39edc1d1e/registry-server/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.744243 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-utilities/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.761453 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-content/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.764054 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-content/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.877058 4722 generic.go:334] "Generic (PLEG): container finished" podID="7749f1af-e8be-4ea4-8ada-9b6367176b90" containerID="cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed" exitCode=0 Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.877111 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgx7g" event={"ID":"7749f1af-e8be-4ea4-8ada-9b6367176b90","Type":"ContainerDied","Data":"cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed"} Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.963741 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-utilities/0.log" Oct 02 09:56:51 crc kubenswrapper[4722]: I1002 09:56:51.977267 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/extract-content/0.log" Oct 02 09:56:52 
crc kubenswrapper[4722]: I1002 09:56:52.127987 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nkmjs_bdbbc0cf-ff76-4e3a-903d-e1dff4fb7ba8/marketplace-operator/0.log" Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.327737 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-utilities/0.log" Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.362418 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcbff_1aba904b-b2dc-4eaf-b466-36c73f90de8a/registry-server/0.log" Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.490842 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-content/0.log" Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.500237 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-utilities/0.log" Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.543729 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-content/0.log" Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.743631 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-content/0.log" Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.764101 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/extract-utilities/0.log" Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.810671 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pc9sc_f7eec357-442f-49c5-a539-067608f33f4f/registry-server/0.log" Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.885165 4722 generic.go:334] "Generic (PLEG): container finished" podID="7749f1af-e8be-4ea4-8ada-9b6367176b90" containerID="80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a" exitCode=0 Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.885213 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgx7g" event={"ID":"7749f1af-e8be-4ea4-8ada-9b6367176b90","Type":"ContainerDied","Data":"80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a"} Oct 02 09:56:52 crc kubenswrapper[4722]: I1002 09:56:52.946818 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-utilities/0.log" Oct 02 09:56:53 crc kubenswrapper[4722]: I1002 09:56:53.099703 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-content/0.log" Oct 02 09:56:53 crc kubenswrapper[4722]: I1002 09:56:53.117780 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-utilities/0.log" Oct 02 09:56:53 crc kubenswrapper[4722]: I1002 09:56:53.117850 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-content/0.log" Oct 02 09:56:53 crc kubenswrapper[4722]: I1002 09:56:53.333804 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-utilities/0.log" Oct 02 09:56:53 crc kubenswrapper[4722]: I1002 09:56:53.346181 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/extract-content/0.log" Oct 02 09:56:53 crc kubenswrapper[4722]: I1002 09:56:53.691357 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k6qks_987e526d-2094-4eb5-a911-c58aee2b53a4/registry-server/0.log" Oct 02 09:56:53 crc kubenswrapper[4722]: I1002 09:56:53.893087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgx7g" event={"ID":"7749f1af-e8be-4ea4-8ada-9b6367176b90","Type":"ContainerStarted","Data":"6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d"} Oct 02 09:56:53 crc kubenswrapper[4722]: I1002 09:56:53.918120 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jgx7g" podStartSLOduration=2.314839224 podStartE2EDuration="3.918094342s" podCreationTimestamp="2025-10-02 09:56:50 +0000 UTC" firstStartedPulling="2025-10-02 09:56:51.878589255 +0000 UTC m=+1467.915342225" lastFinishedPulling="2025-10-02 09:56:53.481844363 +0000 UTC m=+1469.518597343" observedRunningTime="2025-10-02 09:56:53.913288992 +0000 UTC m=+1469.950041992" watchObservedRunningTime="2025-10-02 09:56:53.918094342 +0000 UTC m=+1469.954847322" Oct 02 09:57:00 crc kubenswrapper[4722]: I1002 09:57:00.392887 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:57:00 crc kubenswrapper[4722]: I1002 09:57:00.393493 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:57:00 crc kubenswrapper[4722]: I1002 09:57:00.437150 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:57:00 crc kubenswrapper[4722]: I1002 09:57:00.977606 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:57:01 crc kubenswrapper[4722]: I1002 09:57:01.019324 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgx7g"] Oct 02 09:57:02 crc kubenswrapper[4722]: I1002 09:57:02.948918 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jgx7g" podUID="7749f1af-e8be-4ea4-8ada-9b6367176b90" containerName="registry-server" containerID="cri-o://6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d" gracePeriod=2 Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.318718 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.499094 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-utilities\") pod \"7749f1af-e8be-4ea4-8ada-9b6367176b90\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.499209 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88mdd\" (UniqueName: \"kubernetes.io/projected/7749f1af-e8be-4ea4-8ada-9b6367176b90-kube-api-access-88mdd\") pod \"7749f1af-e8be-4ea4-8ada-9b6367176b90\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.499308 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-catalog-content\") pod \"7749f1af-e8be-4ea4-8ada-9b6367176b90\" (UID: \"7749f1af-e8be-4ea4-8ada-9b6367176b90\") " Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.500009 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-utilities" (OuterVolumeSpecName: "utilities") pod "7749f1af-e8be-4ea4-8ada-9b6367176b90" (UID: "7749f1af-e8be-4ea4-8ada-9b6367176b90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.505114 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7749f1af-e8be-4ea4-8ada-9b6367176b90-kube-api-access-88mdd" (OuterVolumeSpecName: "kube-api-access-88mdd") pod "7749f1af-e8be-4ea4-8ada-9b6367176b90" (UID: "7749f1af-e8be-4ea4-8ada-9b6367176b90"). InnerVolumeSpecName "kube-api-access-88mdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.516171 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7749f1af-e8be-4ea4-8ada-9b6367176b90" (UID: "7749f1af-e8be-4ea4-8ada-9b6367176b90"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.600686 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.600743 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7749f1af-e8be-4ea4-8ada-9b6367176b90-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.600765 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88mdd\" (UniqueName: \"kubernetes.io/projected/7749f1af-e8be-4ea4-8ada-9b6367176b90-kube-api-access-88mdd\") on node \"crc\" DevicePath \"\"" Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.956726 4722 generic.go:334] "Generic (PLEG): container finished" podID="7749f1af-e8be-4ea4-8ada-9b6367176b90" containerID="6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d" exitCode=0 Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.956775 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgx7g" event={"ID":"7749f1af-e8be-4ea4-8ada-9b6367176b90","Type":"ContainerDied","Data":"6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d"} Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.956797 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgx7g" Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.956805 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgx7g" event={"ID":"7749f1af-e8be-4ea4-8ada-9b6367176b90","Type":"ContainerDied","Data":"3960e8b24fc32de2b820351c2a46d1ff5621d5ae198f7ed7fe3d25b07dc8e305"} Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.956825 4722 scope.go:117] "RemoveContainer" containerID="6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d" Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.974747 4722 scope.go:117] "RemoveContainer" containerID="80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a" Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.991224 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgx7g"] Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.995921 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgx7g"] Oct 02 09:57:03 crc kubenswrapper[4722]: I1002 09:57:03.996814 4722 scope.go:117] "RemoveContainer" containerID="cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed" Oct 02 09:57:04 crc kubenswrapper[4722]: I1002 09:57:04.017159 4722 scope.go:117] "RemoveContainer" containerID="6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d" Oct 02 09:57:04 crc kubenswrapper[4722]: E1002 09:57:04.017783 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d\": container with ID starting with 6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d not found: ID does not exist" containerID="6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d" Oct 02 09:57:04 crc kubenswrapper[4722]: I1002 09:57:04.017825 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d"} err="failed to get container status \"6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d\": rpc error: code = NotFound desc = could not find container \"6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d\": container with ID starting with 6341db45cd32a976379bcf5db184ee9208c6407d978c5cfad6f73aeb1e32019d not found: ID does not exist" Oct 02 09:57:04 crc kubenswrapper[4722]: I1002 09:57:04.017862 4722 scope.go:117] "RemoveContainer" containerID="80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a" Oct 02 09:57:04 crc kubenswrapper[4722]: E1002 09:57:04.018212 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a\": container with ID starting with 80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a not found: ID does not exist" containerID="80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a" Oct 02 09:57:04 crc kubenswrapper[4722]: I1002 09:57:04.018243 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a"} err="failed to get container status \"80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a\": rpc error: code = NotFound desc = could not find container \"80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a\": container with ID starting with 80d9e30df35b6cc4db91003fa3574fa841990db42dd1fe477dc198f48bcea80a not found: ID does not exist" Oct 02 09:57:04 crc kubenswrapper[4722]: I1002 09:57:04.018264 4722 scope.go:117] "RemoveContainer" containerID="cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed" Oct 02 09:57:04 crc kubenswrapper[4722]: E1002 09:57:04.018557 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed\": container with ID starting with cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed not found: ID does not exist" containerID="cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed" Oct 02 09:57:04 crc kubenswrapper[4722]: I1002 09:57:04.018583 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed"} err="failed to get container status \"cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed\": rpc error: code = NotFound desc = could not find container \"cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed\": container with ID starting with cddcc1418ccf20cbc51a07f8adae311fde00650fed903477b771c6ec71556bed not found: ID does not exist" Oct 02 09:57:04 crc kubenswrapper[4722]: I1002 09:57:04.717360 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7749f1af-e8be-4ea4-8ada-9b6367176b90" path="/var/lib/kubelet/pods/7749f1af-e8be-4ea4-8ada-9b6367176b90/volumes" Oct 02 09:57:25 crc kubenswrapper[4722]: I1002 09:57:25.863180 4722 scope.go:117] "RemoveContainer" containerID="2a1cea57a939aa89a46fa62cd3100e07fd1ea4d89511fcf4dd690d2ea2b2cdd7" Oct 02 09:57:25 crc kubenswrapper[4722]: I1002 09:57:25.890608 4722 scope.go:117] "RemoveContainer" 
containerID="2bfbc801e2e046d44aa0cb629d3510aa5e15ce16c4aab501d753cc685f1eafa3" Oct 02 09:57:25 crc kubenswrapper[4722]: I1002 09:57:25.906413 4722 scope.go:117] "RemoveContainer" containerID="22f402bd2bf520867d8f8c91b2db03d128c3da48c70db6574dc05106c174a419" Oct 02 09:57:25 crc kubenswrapper[4722]: I1002 09:57:25.925230 4722 scope.go:117] "RemoveContainer" containerID="e2750b9bfe253288155d81b2d5c0baf1568698fa6c79146ffbbbb7766e0abc8e" Oct 02 09:57:25 crc kubenswrapper[4722]: I1002 09:57:25.961191 4722 scope.go:117] "RemoveContainer" containerID="2074b690c7bf32aef78eb9794e0be9a0c85d7a196b3f60274bdc74d2d054361b" Oct 02 09:57:26 crc kubenswrapper[4722]: I1002 09:57:26.012483 4722 scope.go:117] "RemoveContainer" containerID="0303adf9b7d93a00abc9e84ff4975126e8bd2d6793f3cc4a85d2782cf8a6ab6e" Oct 02 09:57:26 crc kubenswrapper[4722]: I1002 09:57:26.035310 4722 scope.go:117] "RemoveContainer" containerID="27a9f8b1e9825b0301e16a4e556b1b9fca0c6d678b5b0f4d2492bcfc055714d4" Oct 02 09:57:26 crc kubenswrapper[4722]: I1002 09:57:26.054348 4722 scope.go:117] "RemoveContainer" containerID="8f261b8ba30dd4058697c6f68bf43b8e9660f4bb5f5a7a8739f8777e51225e11" Oct 02 09:57:57 crc kubenswrapper[4722]: I1002 09:57:57.259588 4722 generic.go:334] "Generic (PLEG): container finished" podID="739d7c43-bf53-4d6c-ac3b-cf795a7efdd3" containerID="89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e" exitCode=0 Oct 02 09:57:57 crc kubenswrapper[4722]: I1002 09:57:57.259649 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggvgk/must-gather-bwg5z" event={"ID":"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3","Type":"ContainerDied","Data":"89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e"} Oct 02 09:57:57 crc kubenswrapper[4722]: I1002 09:57:57.260636 4722 scope.go:117] "RemoveContainer" containerID="89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e" Oct 02 09:57:57 crc kubenswrapper[4722]: I1002 09:57:57.390972 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ggvgk_must-gather-bwg5z_739d7c43-bf53-4d6c-ac3b-cf795a7efdd3/gather/0.log" Oct 02 09:58:05 crc kubenswrapper[4722]: I1002 09:58:05.617691 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ggvgk/must-gather-bwg5z"] Oct 02 09:58:05 crc kubenswrapper[4722]: I1002 09:58:05.618520 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ggvgk/must-gather-bwg5z" podUID="739d7c43-bf53-4d6c-ac3b-cf795a7efdd3" containerName="copy" containerID="cri-o://e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b" gracePeriod=2 Oct 02 09:58:05 crc kubenswrapper[4722]: I1002 09:58:05.623173 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ggvgk/must-gather-bwg5z"] Oct 02 09:58:05 crc kubenswrapper[4722]: I1002 09:58:05.946640 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ggvgk_must-gather-bwg5z_739d7c43-bf53-4d6c-ac3b-cf795a7efdd3/copy/0.log" Oct 02 09:58:05 crc kubenswrapper[4722]: I1002 09:58:05.947451 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ggvgk/must-gather-bwg5z" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.079609 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-must-gather-output\") pod \"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3\" (UID: \"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3\") " Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.079676 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mq7f\" (UniqueName: \"kubernetes.io/projected/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-kube-api-access-9mq7f\") pod \"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3\" (UID: \"739d7c43-bf53-4d6c-ac3b-cf795a7efdd3\") " Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.106477 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-kube-api-access-9mq7f" (OuterVolumeSpecName: "kube-api-access-9mq7f") pod "739d7c43-bf53-4d6c-ac3b-cf795a7efdd3" (UID: "739d7c43-bf53-4d6c-ac3b-cf795a7efdd3"). InnerVolumeSpecName "kube-api-access-9mq7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.147726 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "739d7c43-bf53-4d6c-ac3b-cf795a7efdd3" (UID: "739d7c43-bf53-4d6c-ac3b-cf795a7efdd3"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.180537 4722 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.180570 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mq7f\" (UniqueName: \"kubernetes.io/projected/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3-kube-api-access-9mq7f\") on node \"crc\" DevicePath \"\"" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.311671 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ggvgk_must-gather-bwg5z_739d7c43-bf53-4d6c-ac3b-cf795a7efdd3/copy/0.log" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.312153 4722 generic.go:334] "Generic (PLEG): container finished" podID="739d7c43-bf53-4d6c-ac3b-cf795a7efdd3" containerID="e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b" exitCode=143 Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.312206 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ggvgk/must-gather-bwg5z" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.312218 4722 scope.go:117] "RemoveContainer" containerID="e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.327818 4722 scope.go:117] "RemoveContainer" containerID="89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.360941 4722 scope.go:117] "RemoveContainer" containerID="e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b" Oct 02 09:58:06 crc kubenswrapper[4722]: E1002 09:58:06.363001 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b\": container with ID starting with e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b not found: ID does not exist" containerID="e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.363057 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b"} err="failed to get container status \"e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b\": rpc error: code = NotFound desc = could not find container \"e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b\": container with ID starting with e6ca8bb171cf475d771eaf38b7b69e4d36d6cdfb7325ba78614063142d64013b not found: ID does not exist" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.363082 4722 scope.go:117] "RemoveContainer" containerID="89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e" Oct 02 09:58:06 crc kubenswrapper[4722]: E1002 09:58:06.363405 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e\": container with ID starting with 89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e not found: ID does not exist" containerID="89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.363424 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e"} err="failed to get container status \"89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e\": rpc error: code = NotFound desc = could not find container \"89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e\": container with ID starting with 89902c6c5f3d7a96f00deebb3854cb5d75cb8ab0a81566f4e2a1ee031337ab4e not found: ID does not exist" Oct 02 09:58:06 crc kubenswrapper[4722]: I1002 09:58:06.717060 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="739d7c43-bf53-4d6c-ac3b-cf795a7efdd3" path="/var/lib/kubelet/pods/739d7c43-bf53-4d6c-ac3b-cf795a7efdd3/volumes" Oct 02 09:58:26 crc kubenswrapper[4722]: I1002 09:58:26.167922 4722 scope.go:117] "RemoveContainer" containerID="23a74c9ce6b0ff4df8f7d06b59f6b2240d95d5d712ca55a741bd3392a648e7e4" Oct 02 09:58:26 crc kubenswrapper[4722]: I1002 09:58:26.186306 4722 scope.go:117] "RemoveContainer" containerID="9c8d9cdffe3739783b9c84aadf5882794ba34aa4f179ab4efcece743b2dbc3c1" Oct 02 09:58:26 crc 
kubenswrapper[4722]: I1002 09:58:26.206337 4722 scope.go:117] "RemoveContainer" containerID="03658ec23435e475b875109a154a717e637a684536a983dea5d585f51fed5a12" Oct 02 09:58:26 crc kubenswrapper[4722]: I1002 09:58:26.223599 4722 scope.go:117] "RemoveContainer" containerID="813bbab6b0c394b08527dfb0e0d63037078a1a4f044f5d301bcf8ea84be58f04" Oct 02 09:58:26 crc kubenswrapper[4722]: I1002 09:58:26.252640 4722 scope.go:117] "RemoveContainer" containerID="6bc3f10e6ffbcfeb6d2aefe03a065cf85c1b0235f4091caeb2b3a328fec22120" Oct 02 09:58:26 crc kubenswrapper[4722]: I1002 09:58:26.265876 4722 scope.go:117] "RemoveContainer" containerID="914876550fa92687eacf3ef88ca106da7cae7d5ed05ec61721d9880195afa93c" Oct 02 09:58:26 crc kubenswrapper[4722]: I1002 09:58:26.280365 4722 scope.go:117] "RemoveContainer" containerID="d75f59501b769c4635f9254765268f015f311e6ef2750517e973be3d67a52f7f" Oct 02 09:58:40 crc kubenswrapper[4722]: I1002 09:58:40.697471 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:58:40 crc kubenswrapper[4722]: I1002 09:58:40.697862 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:59:10 crc kubenswrapper[4722]: I1002 09:59:10.697233 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:59:10 crc kubenswrapper[4722]: I1002 09:59:10.697966 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:59:40 crc kubenswrapper[4722]: I1002 09:59:40.696913 4722 patch_prober.go:28] interesting pod/machine-config-daemon-q6h76 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 09:59:40 crc kubenswrapper[4722]: I1002 09:59:40.697506 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 09:59:40 crc kubenswrapper[4722]: I1002 09:59:40.697556 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" Oct 02 09:59:40 crc kubenswrapper[4722]: I1002 09:59:40.698171 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ceae4c47df969dfe7cb589f42f242be03f2c0101a539c919139be87800d32c48"} pod="openshift-machine-config-operator/machine-config-daemon-q6h76" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 09:59:40 crc kubenswrapper[4722]: I1002 09:59:40.698224 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" containerName="machine-config-daemon" containerID="cri-o://ceae4c47df969dfe7cb589f42f242be03f2c0101a539c919139be87800d32c48" gracePeriod=600 Oct 02 09:59:40 crc kubenswrapper[4722]: E1002 09:59:40.816381 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q6h76_openshift-machine-config-operator(f662826c-e772-4f56-a85f-83971aaef356)\"" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356" Oct 02 09:59:40 crc kubenswrapper[4722]: I1002 09:59:40.844654 4722 generic.go:334] "Generic (PLEG): container finished" podID="f662826c-e772-4f56-a85f-83971aaef356" containerID="ceae4c47df969dfe7cb589f42f242be03f2c0101a539c919139be87800d32c48" exitCode=0 Oct 02 09:59:40 crc kubenswrapper[4722]: I1002 09:59:40.844746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" event={"ID":"f662826c-e772-4f56-a85f-83971aaef356","Type":"ContainerDied","Data":"ceae4c47df969dfe7cb589f42f242be03f2c0101a539c919139be87800d32c48"} Oct 02 09:59:40 crc kubenswrapper[4722]: I1002 09:59:40.844794 4722 scope.go:117] "RemoveContainer" containerID="1e9855b8e1fed4be34f938fd52af7ab8c5e0e45f7054a550ded49ce27d2cb1d2" Oct 02 09:59:40 crc kubenswrapper[4722]: I1002 09:59:40.845449 4722 scope.go:117] "RemoveContainer" containerID="ceae4c47df969dfe7cb589f42f242be03f2c0101a539c919139be87800d32c48" Oct 02 09:59:40 crc kubenswrapper[4722]: E1002 09:59:40.845767 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q6h76_openshift-machine-config-operator(f662826c-e772-4f56-a85f-83971aaef356)\"" pod="openshift-machine-config-operator/machine-config-daemon-q6h76" podUID="f662826c-e772-4f56-a85f-83971aaef356"